hexsha stringlengths 40 40 | size int64 5 1.04M | ext stringclasses 6 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 3 344 | max_stars_repo_name stringlengths 5 125 | max_stars_repo_head_hexsha stringlengths 40 78 | max_stars_repo_licenses listlengths 1 11 | max_stars_count int64 1 368k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 3 344 | max_issues_repo_name stringlengths 5 125 | max_issues_repo_head_hexsha stringlengths 40 78 | max_issues_repo_licenses listlengths 1 11 | max_issues_count int64 1 116k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 3 344 | max_forks_repo_name stringlengths 5 125 | max_forks_repo_head_hexsha stringlengths 40 78 | max_forks_repo_licenses listlengths 1 11 | max_forks_count int64 1 105k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 5 1.04M | avg_line_length float64 1.14 851k | max_line_length int64 1 1.03M | alphanum_fraction float64 0 1 | lid stringclasses 191 values | lid_prob float64 0.01 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fa1a69cc91f8f8c2dcd8ca810ba1537e683b9761 | 356 | md | Markdown | posts/2007/03/low-disk-space.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | null | null | null | posts/2007/03/low-disk-space.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | null | null | null | posts/2007/03/low-disk-space.md | atmos/tumblr.atmos.org | 1865e6fe271d4c28047ac50fd4ace154be411ff1 | [
"MIT"
] | 2 | 2019-05-06T18:02:23.000Z | 2019-05-06T18:27:47.000Z | <!--
id: 33353
link: http://tumblr.atmos.org/post/33353/low-disk-space
slug: low-disk-space
date: Sat Mar 03 2007 15:51:18 GMT-0800 (PST)
publish: 2007-03-03
tags:
title: low disk space
-->
low disk space
==============
[http://flickr.com/photos/jeremyhubert/409070599/in/photostream/](http://flickr.com/photos/jeremyhubert/409070599/in/photostream/)
| 20.941176 | 130 | 0.707865 | kor_Hang | 0.194606 |
fa1acdeef22d045d77febc6d274a5e3f84d0d522 | 741 | md | Markdown | includes/sample-powershell-install-no-ssh.md | YutongTie-MSFT/azure-docs.de-de | f7922d4a0ebfb2cbb31d7004d4f726202f39716b | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/sample-powershell-install-no-ssh.md | YutongTie-MSFT/azure-docs.de-de | f7922d4a0ebfb2cbb31d7004d4f726202f39716b | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/sample-powershell-install-no-ssh.md | YutongTie-MSFT/azure-docs.de-de | f7922d4a0ebfb2cbb31d7004d4f726202f39716b | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
author: sptramer
ms.topic: include
ms.date: 01/30/2019
ms.service: azure-powershell
ms.author: sttramer
ms.openlocfilehash: f04a4ca8c0b160dc2bcc762cc1c570737dc945d5
ms.sourcegitcommit: f24fdd1ab23927c73595c960d8a26a74e1d12f5d
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 03/27/2019
ms.locfileid: "58505730"
---
Für dieses Beispiel ist Azure PowerShell erforderlich. Ermitteln Sie durch die Ausführung von `Get-Module -ListAvailable Az`, ob PowerShell installiert ist. Wenn Sie die Installation ausführen müssen, finden Sie unter [Installieren des Azure PowerShell-Moduls](/powershell/azure/install-az-ps) Informationen dazu.
Führen Sie zum Starten `Connect-AzAccount` aus, um eine Verbindung mit Azure herzustellen.
| 43.588235 | 314 | 0.820513 | deu_Latn | 0.856369 |
fa1b35ec29bacb533ab56f430ce1a65cf92e3266 | 25,603 | md | Markdown | windows-apps-src/audio-video-camera/process-media-frames-with-mediaframereader.md | toto6038/windows-uwp.zh-tw | 54f30beb577251dd10ff9398a40359d81e590cf9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-apps-src/audio-video-camera/process-media-frames-with-mediaframereader.md | toto6038/windows-uwp.zh-tw | 54f30beb577251dd10ff9398a40359d81e590cf9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-apps-src/audio-video-camera/process-media-frames-with-mediaframereader.md | toto6038/windows-uwp.zh-tw | 54f30beb577251dd10ff9398a40359d81e590cf9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
ms.assetid: a128edc8-8a80-4645-ac29-908ede2d1c72
description: 本文章示範如何使用 MediaFrameReader 搭配 MediaCapture,從一或多個可用來源取得媒體畫面。來源包括色彩、深度及紅外線相機、音訊裝置,甚至是自訂畫面來源 (例如,能產生骨骼追蹤畫面的來源)。
title: 使用 MediaFrameReader 處理媒體畫面
ms.date: 02/08/2017
ms.topic: article
keywords: windows 10, uwp
ms.localizationpriority: medium
ms.openlocfilehash: b53103f0d0c67bd18b71ac94812f4cef53ca8ac0
ms.sourcegitcommit: c3ca68e87eb06971826087af59adb33e490ce7da
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 09/02/2020
ms.locfileid: "89363811"
---
# <a name="process-media-frames-with-mediaframereader"></a>使用 MediaFrameReader 處理媒體畫面
本文章示範如何使用 [**MediaFrameReader**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameReader) 搭配 [**MediaCapture**](/uwp/api/Windows.Media.Capture.MediaCapture),從一或多個可用來源取得媒體畫面。來源包括色彩、深度及紅外線相機、音訊裝置,甚至是自訂畫面來源 (例如,能產生骨骼追蹤畫面的來源)。 此功能是針對要讓執行媒體畫面即時處理的 App 使用所設計,例如虛擬實境及深度感知相機 App。
如果您只對擷取視訊或相片感興趣,例如典型的相片應用程式,則您可能想要使用 [**MediaCapture**](/uwp/api/Windows.Media.Capture.MediaCapture) 支援的其他擷取技術的其中之一。 如需可用的媒體擷取技術和說明使用方式之文章的清單,請參閱[**相機**](camera.md)。
> [!NOTE]
> 本文中所討論的功能只從 Windows 10 版本 1607 開始提供。
> [!NOTE]
> 還有一個通用 Windows app 範例,示範使用 **MediaFrameReader** 顯示來自不同畫面來源 (包括色彩、深度與紅外線相機) 的畫面。 如需詳細資訊,請參閱[相機畫面範例](https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/CameraFrames)。
> [!NOTE]
> 使用 **MediaFrameReader** 搭配音訊資料的一組全新 API 在 Windows 10 版本 1803 中引進。 如需詳細資訊,請參閱[使用 MediaFrameReader 處理音訊框架](process-audio-frames-with-mediaframereader.md)。
## <a name="setting-up-your-project"></a>設定您的專案
就像任何使用 **MediaCapture** 的 App 一樣,您必須在嘗試存取任何相機裝置之前,宣告您的 App 是使用*網路攝影機*功能。 如果您的應用程式會從音訊裝置擷取,您也應該宣告*麥克風*裝置功能。
**將功能新增到應用程式資訊清單**
1. 在 Microsoft Visual Studio 的 **方案總管**中,按兩下 **package.appxmanifest** 專案以開啟應用程式資訊清單的設計工具。
2. 選取 [功能] 索引標籤。
3. 核取 [ **網路** 攝影機] 和 [ **麥克風**] 方塊的核取方塊。
4. 若要存取圖片和影片庫,請核取 **圖片媒體** 櫃的方塊和影片 **庫**的方塊。
除了預設專案範本所包含的 API 以外,這篇文章中的範例程式碼還會使用下列命名空間的 API。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetFramesUsing":::
## <a name="select-frame-sources-and-frame-source-groups"></a>選取畫面來源和畫面來源群組
許多處理媒體畫面的 App 需要一次從多個來源取得畫面,例如裝置的彩色和景深相機。 [**MediaFrameSourceGroup**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceGroup)物件代表一組可同時使用的媒體框架來源。 呼叫靜態方法 [**MediaFrameSourceGroup.FindAllAsync**](/uwp/api/windows.media.capture.frames.mediaframesourcegroup.findallasync),以取得目前裝置所支援之所有畫面來源群組的清單。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetFindAllAsync":::
您也可以使用[**DeviceInformation**](/uwp/api/windows.devices.enumeration.deviceinformation.createwatcher)建立[**DeviceWatcher**](/uwp/api/Windows.Devices.Enumeration.DeviceWatcher) ,以及在裝置上的可用畫面格來源群組變更時接收通知,例如當外部相機插入[**時。**](/uwp/api/windows.media.capture.frames.mediaframesourcegroup.getdeviceselector) 如需詳細資訊,請參閱[**列舉裝置**](../devices-sensors/enumerate-devices.md)。
[**MediaFrameSourceGroup**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceGroup) 有一個 [**MediaFrameSourceInfo**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceInfo) 物件的集合,可描述群組中包含的畫面來源。 在擷取裝置上可用的畫面來源群組之後,您可以選取公開您感興趣的畫面來源群組。
下列範例示範選取畫面來源群組最簡單的方式。 這個程式碼僅重複查看所有可用的群組,然後重複查看 [**SourceInfos**](/uwp/api/windows.media.capture.frames.mediaframesourcegroup.sourceinfos) 集合中的每個項目。 每個 **MediaFrameSourceInfo** 都會被檢查,以查看是否支援我們要尋找的功能。 在這個案例中,會針對值 [**VideoPreview**](/uwp/api/windows.media.capture.frames.mediaframesourceinfo.mediastreamtype) 檢查 [**MediaStreamType**](/uwp/api/Windows.Media.Capture.MediaStreamType) 屬性,表示裝置提供一個視訊預覽資料流,以及針對值 [**Color**](/uwp/api/windows.media.capture.frames.mediaframesourceinfo.sourcekind) 檢查 [**SourceKind**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceKind) 屬性,指示來源提供色彩畫面。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetSimpleSelect":::
這個識別想要的畫面來源群組和畫面來源的方法,適用於簡單的案例,但是如果您想要根據更複雜的條件來選取畫面來源,它可能很快就會變得很麻煩。 另一種方法是使用 Linq 語法和匿名物件來選取項目。 下列範例使用 **Select** 延伸方法,來將 *frameSourceGroups* 清單中的 **MediaFrameSourceGroup** 物件轉換為包含兩個欄位的匿名物件:*sourceGroup*,代表群組本身,以及 *colorSourceInfo*,代表群組中的色彩畫面來源。 *colorSourceInfo* 欄位會設定為 **FirstOrDefault** 的結果,這會選取所提供述詞解析為 true 的第一個物件。 在這個案例中,如果資料流類型為 **VideoPreview**、來源種類為 **Color**,且相機位於裝置的前方面板上,述詞為 true。
從上述查詢傳回的匿名物件清單,**Where** 延伸方法是僅用來選取那些 *colorSourceInfo* 欄位不是 null 的物件。 最後,會呼叫 **FirstOrDefault** 來選取清單中的第一個項目。
現在您可以使用已選取物件的欄位來取得對已選取的 **MediaFrameSourceGroup** 和代表色彩相機之 **MediaFrameSourceInfo** 物件的參考。 這些稍後將會用來初始化 **MediaCapture** 物件,並針對已選取的來源建立 **MediaFrameReader**。 最後,您應該測試以查看來源群組是否為 null,這表示目前的裝置沒有您要求的擷取來源。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetSelectColor":::
下列範例使用上述類似技術,來選取包含彩色、景深與紅外線相機的來源群組。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetColorInfraredDepth":::
> [!NOTE]
> 從 Windows 10 版本 1803 開始,您可以使用 [**MediaCaptureVideoProfile**](/uwp/api/Windows.Media.Capture.MediaCaptureVideoProfile) 類別來選取具有一組所需功能的媒體畫面來源。 如需詳細資訊,請參閱本文稍後的**選取畫面來源使用影片的設定檔**章節。
## <a name="initialize-the-mediacapture-object-to-use-the-selected-frame-source-group"></a>初始化 MediaCapture 物件,以使用所選的畫面來源群組
下一步是初始化 **MediaCapture** 物件,以使用您在上一個步驟中選取的畫面來源群組。
**MediaCapture** 物件通常會在您 App 內部的多個位置使用,因此您應該宣告一個類別成員變數來保存它。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetDeclareMediaCapture":::
藉由呼叫建構函式,建立 **MediaCapture** 物件的執行個體。 接著,建立將用來初始化**MediaCapture**物件的[**MediaCaptureInitializationSettings**](/uwp/api/windows.media.capture.mediacaptureinitializationsettings)物件。 在這個範例中,會使用下列設定︰
* [**SourceGroup**](/uwp/api/windows.media.capture.mediacaptureinitializationsettings.sourcegroup) - 這會告訴系統您將會使用哪一個來源群組取得畫面。 請記住,來源群組會定義一組可同時使用的媒體畫面來源。
* [**SharingMode**](/uwp/api/windows.media.capture.mediacaptureinitializationsettings.sharingmode) - 這會告訴系統您是否需要擷取來源裝置的專屬控制項。 如果您將此設定為 [**ExclusiveControl**](/uwp/api/Windows.Media.Capture.MediaCaptureSharingMode),就表示您可以變更擷取裝置的設定 (像是裝置所產生的畫面格式),但是這表示如果其他 App 已經有專屬控制項,當您的 App 嘗試初始化媒體擷取裝置時將會失敗。 如果您將此設定為 [**SharedReadOnly**](/uwp/api/Windows.Media.Capture.MediaCaptureSharingMode),即使畫面來源正由其他 App 使用,您也可以接收來自畫面來源的畫面,但您無法變更裝置的設定。
* [**MemoryPreference**](/uwp/api/windows.media.capture.mediacaptureinitializationsettings.memorypreference) - 如果您指定 [**CPU**](/uwp/api/Windows.Media.Capture.MediaCaptureMemoryPreference),系統將會使用 CPU 記憶體確保當畫面抵達時,畫面可以做為 [**SoftwareBitmap**](/uwp/api/Windows.Graphics.Imaging.SoftwareBitmap) 物件使用。 如果您指定 [**Auto**](/uwp/api/Windows.Media.Capture.MediaCaptureMemoryPreference),系統會以動態方式選擇最佳的記憶體位置來儲存畫面。 如果系統選擇使用 GPU 記憶體,媒體畫面會以 [**IDirect3DSurface**](/uwp/api/Windows.Graphics.DirectX.Direct3D11.IDirect3DSurface) 物件的方式抵達,而不是 **SoftwareBitmap**。
* [**StreamingCaptureMode**](/uwp/api/windows.media.capture.mediacaptureinitializationsettings.streamingcapturemode) - 將它設定為 [**Video**](/uwp/api/Windows.Media.Capture.StreamingCaptureMode) 以指示該音訊不需要串流處理。
呼叫 [**InitializeAsync**](/uwp/api/windows.media.capture.mediacapture.initializeasync),以使用您想要的設定將 **MediaCapture** 初始化。 請務必在 *try* 區塊中呼叫,以防初始化失敗。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetInitMediaCapture":::
## <a name="set-the-preferred-format-for-the-frame-source"></a>針對畫面來源設定慣用的格式
若要設定畫面來源的慣用格式,您必須取得代表來源的 [**MediaFrameSource**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSource) 物件。 您可以透過存取已初始化 **MediaCapture** 物件的 [**Frames**](/previous-versions/windows/apps/phone/jj207578(v=win.10)) 字典,指定您想要使用的畫面來源的識別碼來取得此物件。 這就是為什麼我們在選取畫面來源群組時會儲存 [**MediaFrameSourceInfo**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceInfo) 物件的原因。
[**MediaFrameSource.SupportedFormats**](/uwp/api/windows.media.capture.frames.mediaframesource.supportedformats) 屬性包含一份 [**MediaFrameFormat**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameFormat) 物件的清單,其中說明了畫面來源的支援格式。 使用 **Where** Linq 延伸方法,根據所需的屬性選取格式。 在這個範例中,會選取一個寬度為 1080 像素,並且可提供 32 位元 RGB 格式畫面的格式。 **FirstOrDefault** 延伸方法會選取清單中的第一個項目。 如果選取的格式為 null,那麼畫面來源就不支援要求的格式。 如果是支援的格式,您可透過呼叫 [**SetFormatAsync**](../develop/index.md),要求來源使用此格式。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetGetPreferredFormat":::
## <a name="create-a-frame-reader-for-the-frame-source"></a>建立畫面來源的畫面讀取程式
若要接收媒體畫面來源的畫面,請使用 [**MediaFrameReader**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameReader)。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetDeclareMediaFrameReader":::
藉由呼叫已初始化 **MediaCapture** 物件上的[**CreateFrameReaderAsync**](/uwp/api/windows.media.capture.mediacapture.createframereaderasync) 將畫面讀取程式具現化。 這個方法的第一個引數是您想要接收畫面的畫面來源。 您可以針對每個想要使用的畫面來源建立個別的畫面讀取程式。 第二個引數會告訴系統您想要畫面到達時使用的輸出格式。 這可以幫助您避免在畫面到達時必須自行轉換。 請注意,如果您指定畫面來源不支援的格式,會擲回例外狀況,因此請確定此值存在於 [**SupportedFormats**](/uwp/api/windows.media.capture.frames.mediaframesource.supportedformats) 集合中。
建立畫面讀取程式之後,請登錄 [**FrameArrived**](/uwp/api/windows.media.capture.frames.mediaframereader.framearrived) 事件的處理常式,只要來源有可用的新畫面時就會引發該事件。
透過呼叫 [**StartAsync**](/uwp/api/windows.media.capture.frames.mediaframereader.startasync) 告訴系統開始從來源讀取畫面。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetCreateFrameReader":::
## <a name="handle-the-frame-arrived-event"></a>處理畫面已到達事件
當有新畫面可用時,就會引發 [**MediaFrameReader.FrameArrived**](/uwp/api/windows.media.capture.frames.mediaframereader.framearrived) 事件。 您可以選擇處理每一個到達的畫面,或者僅在您需要時使用畫面。 因為畫面讀取程式會引發其本身執行緒上的事件,您可能需要實作部分同步邏輯來確定您沒有嘗試從多個執行緒存取相同的資料。 本節說明如何將繪圖色彩畫面同步到 XAML 頁面中的影像控制項。 本案例會解決要求在 UI 執行緒上執行所有 XAML 控制項更新的其他同步限制式。
在 XAML 中顯示畫面的第一步是建立影像控制項。
:::code language="xml" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml" id="SnippetImageElementXAML":::
在程式碼後置頁面中,宣告 **SoftwareBitmap** 類型的類別成員變數,這會做為後端緩衝區,以暫存所有複製的傳入影像。 請注意,影像資料本身不會複製,只會複製物件參考。 此外,請宣告布林值以追蹤我們的 UI 作業目前是否正在執行。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetDeclareBackBuffer":::
因為畫面會以 **SoftwareBitmap** 物件的方式到達,所以您必須建立 [**SoftwareBitmapSource**](/uwp/api/Windows.UI.Xaml.Media.Imaging.SoftwareBitmapSource) 物件,它可讓您使用 **SoftwareBitmap** 做為 XAML **控制項**的來源。 在您啟動畫面讀取程式之前,您應該在您程式碼中某處設定影像來源。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetImageElementSource":::
現在就可以實作 **FrameArrived** 事件處理常式。 呼叫這個處理常式時,*sender* 參數包含對引發事件的 **MediaFrameReader** 物件的參考。 在此物件上呼叫 [**TryAcquireLatestFrame**](/uwp/api/windows.media.capture.frames.mediaframereader.tryacquirelatestframe) 以嘗試取得最新畫面。 如同名稱所示,**TryAcquireLatestFrame** 可能無法成功傳回畫面。 因此,當您存取 VideoMediaFrame 和 SoftwareBitmap 屬性時,請確定針對 null 進行測試。 這個範例是 null 條件式運算子嗎 ? 是用來存取 **SoftwareBitmap**,然後會檢查擷取的物件是否為 null。
**Image** 控制項可以僅顯示預乘或無 Alpha 的 BRGA8 格式影像。 如果送達的畫面不是該格式,會使用靜態方法 [**Convert**](/uwp/api/windows.graphics.imaging.softwarebitmap.convert) 將軟體點陣圖轉換成正確的格式。
接下來,[**Interlocked.Exchange**](/dotnet/api/system.threading.interlocked.exchange#System_Threading_Interlocked_Exchange__1___0____0_) 方法會用來交換送達點陣圖的參考與後端緩衝區點陣圖。 這個方法會在安全執行緒的不可部分完成作業中交換這些參考。 交換之後,會處置舊的後端緩衝區影像 (現在在 *softwareBitmap* 變數中) 以清除其資源。
接下來,會使用與 **Image** 元素相關聯的 [**CoreDispatcher**](/uwp/api/Windows.UI.Core.CoreDispatcher) 來建立將透過呼叫 [**RunAsync**](/uwp/api/windows.ui.core.coredispatcher.runasync) 在 UI 執行緒上執行的工作。 因為非同步工作將會在工作中執行,所以 lambda 運算式會傳遞到使用 *async* 關鍵字宣告的 **RunAsync**。
在工作中,會檢查 *_taskRunning* 變數以確定一次只執行工作的一個執行個體。 如果工作尚未執行,*_taskRunning* 會設為 true,以避免再次執行工作。 在 *while* 迴圈中,會呼叫 **Interlocked.Exchange** 來從後端緩衝區複製到暫存 **SoftwareBitmap** 中,直到後端緩衝區影像為 null。 每次填入暫存點陣圖,**Image** 的 **Source** 屬性會轉換成 **SoftwareBitmapSource**,然後呼叫 [**SetBitmapAsync**](/uwp/api/windows.ui.xaml.media.imaging.softwarebitmapsource.setbitmapasync) 來設定影像來源。
最後,*_taskRunning* 變數會設定回 false,就可以在下一次呼叫處理常式時再次執行工作。
> [!NOTE]
> 如果您要存取 [**MediaFrameReference**](/uwp/api/windows.media.capture.frames.videomediaframe.softwarebitmap) 的 [**VideoMediaFrame**](/uwp/api/windows.media.capture.frames.videomediaframe.direct3dsurface) 屬性提供的 [**SoftwareBitmap**](/uwp/api/windows.media.capture.frames.mediaframereference.videomediaframe) 或 [**Direct3DSurface**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameReference) 物件,則系統會建立這些物件的強式參考,這表示當您在包含的 **MediaFrameReference** 上呼叫 [**Dispose**](/uwp/api/windows.media.capture.frames.mediaframereference.close) 時,他們不會被處置。 您必須針對要立即處置的物件明確地直接呼叫 **SoftwareBitmap** 或 **Direct3DSurface** 的 **Dispose** 方法。 否則,記憶體回收行程最終會釋放這些物件的記憶體,但您無法得知何時會釋放,而且如果配置的點陣圖或表面的數量超過系統允許的數量上限,新畫面的資料流就會停止。 您可以複製擷取的畫面 (例如使用 [**SoftwareBitmap.Copy**](/uwp/api/windows.graphics.imaging.softwarebitmap.copy) 方法),然後釋出原始畫面來克服這項限制。 此外,如果您使用多載 [CreateFrameReaderAsync(Windows.Media.Capture.Frames.MediaFrameSource inputSource, System.String outputSubtype, Windows.Graphics.Imaging.BitmapSize outputSize)](/uwp/api/windows.media.capture.mediacapture.createframereaderasync#Windows_Media_Capture_MediaCapture_CreateFrameReaderAsync_Windows_Media_Capture_Frames_MediaFrameSource_System_String_Windows_Graphics_Imaging_BitmapSize_) 或 [CreateFrameReaderAsync(Windows.Media.Capture.Frames.MediaFrameSource inputSource, System.String outputSubtype)](/uwp/api/windows.media.capture.mediacapture.createframereaderasync#Windows_Media_Capture_MediaCapture_CreateFrameReaderAsync_Windows_Media_Capture_Frames_MediaFrameSource_System_String_) 建立 **MediaFrameReader**,傳回的畫面會是原始畫面資料的複本,因此在保留畫面時不會造成畫面擷取停止。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetFrameArrived":::
## <a name="cleanup-resources"></a>清除資源
當您完成後讀取畫面時,請確定有透過呼叫 [**StopAsync**](/uwp/api/windows.media.capture.frames.mediaframereader.stopasync)、取消登錄 **FrameArrived** 處理常式,並處置 **MediaCapture** 物件,來停止媒體畫面讀取程式。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetCleanup":::
如需有關在您的應用程式暫停時清理媒體擷取物件的詳細資訊,請參閱[**顯示相機預覽**](simple-camera-preview-access.md)。
## <a name="the-framerenderer-helper-class"></a>FrameRenderer 協助程式類別
通用 Windows [相機畫面範例](https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/CameraFrames)提供了一個協助程式類別,可讓顯示來自您 App 中色彩、紅外線與深度來源的畫面變得更容易。 一般來說,您可能會想針對深度或紅外線資料多執行一些動作,而不僅是在螢幕上顯示,但是這個協助程式類別對於示範畫面讀取程式功能以及偵錯您自己的畫面讀取程式實作而言,是很有幫助的工具。
**FrameRenderer** 協助程式類別會實作下列方法。
* **FrameRenderer** 建構函式 - 建構函式會初始化協助程式類別,以使用您傳入的 XAML [**Image**](/uwp/api/Windows.UI.Xaml.Controls.Image) 元素來顯示媒體畫面。
* **ProcessFrame** - 這個方法會顯示您傳入建構函式的 **Image** 元素中,以 [**MediaFrameReference**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameReference) 表示的媒體畫面。 您通常應該從 [**FrameArrived**](/uwp/api/windows.media.capture.frames.mediaframereader.framearrived) 事件處理常式中呼叫此方法,傳入由 [**TryAcquireLatestFrame**](/uwp/api/windows.media.capture.frames.mediaframereader.tryacquirelatestframe) 傳回的畫面。
* **ConvertToDisplayableImage** - 這個方法會檢查媒體畫面的格式,必要時會將媒體畫面的格式轉換成可顯示的格式。 針對色彩影像,這代表確定色彩格式為 BGRA8 且已預乘點陣圖 Alpha 模式。 針對深度或紅外線畫面,會使用也包含於範例中的 **PsuedoColorHelper** 類別 (如下所示) 來處理每條掃描線,以將深度或紅外線畫面轉換為偽色彩漸層。
> [!NOTE]
> 為了在 **SoftwareBitmap** 影像上執行像素操作,您必須存取原生記憶體緩衝區。 若要這樣做,您必須使用包含於下面所列程式碼中的 IMemoryBufferByteAccess COM 介面,且您必須更新您的專案屬性,以允許編譯不安全的程式碼。 如需詳細資訊,請參閱[建立、編輯及儲存點陣圖影像](imaging.md)。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/FrameRenderer.cs" id="SnippetIMemoryBufferByteAccess":::
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/FrameRenderer.cs" id="SnippetFrameRenderer":::
## <a name="use-multisourcemediaframereader-to-get-time-corellated-frames-from-multiple-sources"></a>使用 MultiSourceMediaFrameReader 從多個來源取得與時間相互關聯的畫面
從 Windows 10 版本 1607 開始,您可以使用 [**MultiSourceMediaFrameReader**](/uwp/api/windows.media.capture.frames.multisourcemediaframereader),從多個來源接收與時間相互關聯的畫面。 此 API 可讓您更容易進行需要來自多個拍攝時間相近來源之畫面的處理,例如使用 [**DepthCorrelatedCoordinateMapper**](/uwp/api/windows.media.devices.core.depthcorrelatedcoordinatemapper) 類別。 使用這個新方法有一項限制,就是只能以最慢擷取來源的速率來引發畫面已到達事件。 將會捨棄較快速來源的額外畫面。 此外,由於系統預期畫面以不同的速率從不同的來源抵達,因此也無法自動辨識來源是否已完全停止產生畫面。 本節的範例程式碼示範如何使用事件建立您自己的逾時邏輯,只要相互關聯的畫面未在應用程式定義的時間限制內抵達,就會叫用此邏輯。
使用 [**MultiSourceMediaFrameReader**](/uwp/api/windows.media.capture.frames.multisourcemediaframereader) 的步驟和使用本文之前所述 [**MediaFrameReader**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameReader) 的步驟相似。 此範例將會使用色彩來源和深度來源。 宣告一些字串變數來儲存媒體畫面來源識別碼,這些識別碼將用來選取每個來源的畫面。 接下來,宣告用於實作範例逾時邏輯的 [**ManualResetEventSlim**](/dotnet/api/system.threading.manualreseteventslim)、[**CancellationTokenSource**](/dotnet/api/system.threading.cancellationtokensource) 和 [**EventHandler**](/dotnet/api/system.eventhandler)。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMultiFrameDeclarations":::
使用本文先前所述的技術,查詢包含此範例案例所需色彩及深度來源的 [**MediaFrameSourceGroup**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceGroup)。 選取所要的畫面來源群組之後,取得每個畫面來源的 [**MediaFrameSourceInfo**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSourceInfo)。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetSelectColorAndDepth":::
將選取的畫面來源群組傳入初始化設定,建立和初始化 **MediaCapture** 物件。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMultiFrameInitMediaCapture":::
初始化 **MediaCapture** 物件後,擷取彩色攝影機和景深相機的 [**MediaFrameSource**](/uwp/api/Windows.Media.Capture.Frames.MediaFrameSource) 物件。 儲存每個來源的識別碼,這樣您就可以選取對應來源的到達畫面。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetGetColorAndDepthSource":::
呼叫 [**CreateMultiSourceFrameReaderAsync**](/uwp/api/windows.media.capture.mediacapture.createmultisourceframereaderasync) 並傳遞讀取器使用的畫面來源陣列,以建立和初始化 **MultiSourceMediaFrameReader**。 註冊 [**FrameArrived**](/uwp/api/windows.media.capture.frames.multisourcemediaframereader.FrameArrived) 事件的事件處理常式。 此範例會建立本文之前所述 **FrameRenderer** 協助程式類別的執行個體,將畫面呈現至 **Image** 控制項。 呼叫 [**StartAsync**](/uwp/api/windows.media.capture.frames.multisourcemediaframereader.StartAsync) 以啟動畫面讀取器。
註冊範例先前宣告之 **CorellationFailed** 事件的事件處理常式。 我們會通知這個事件,正在使用的其中一個畫面來源是否停止產生畫面。 最後,呼叫 [**Task.Run**](/dotnet/api/system.threading.tasks.task.run#System_Threading_Tasks_Task_Run_System_Action_) 以呼叫不同執行緒上的逾時協助程式方法 **NotifyAboutCorrelationFailure**。 本文稍後會示範此方法的實作。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetInitMultiFrameReader":::
每當所有由 **MultiSourceMediaFrameReader** 管理的媒體畫面來源有新的畫面時,都會引發 **FrameArrived** 事件。 這表示將會按照最慢媒體來源的頻率引發事件。 如果某個來源在較慢來源產生一個畫面的同時產生多個畫面,將會捨棄該快速來源的額外畫面。
呼叫 [**TryAcquireLatestFrame**](/uwp/api/windows.media.capture.frames.multisourcemediaframereader.TryAcquireLatestFrame) 以取得與事件相關聯的 [**MultiSourceMediaFrameReference**](/uwp/api/windows.media.capture.frames.multisourcemediaframereference)。 呼叫 [**TryGetFrameReferenceBySourceId**](/uwp/api/windows.media.capture.frames.multisourcemediaframereference.trygetframereferencebysourceid) 並傳入初始化畫面讀取器時所儲存的識別碼字串,以取得與每個媒體畫面來源相關聯的 **MediaFrameReference**。
呼叫 **ManualResetEventSlim** 物件的 [**Set**](/dotnet/api/system.threading.manualreseteventslim.set#System_Threading_ManualResetEventSlim_Set) 方法以發出畫面已到達的通知訊號。 我們將會在執行於不同執行緒的 **NotifyCorrelationFailure** 方法中檢查這個事件。
最後,在與時間相互關聯的媒體畫面上執行任何處理。 此範例只是顯示來自深度來源的畫面。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMultiFrameArrived":::
啟動畫面讀取器後,**NotifyCorrelationFailure** 協助程式方法已在不同的執行緒中執行。 在此方法中,檢查是否發出了畫面已收到事件的通知訊號。 請記住,在 **FrameArrived** 處理常式中,只要有一組相互關聯的畫面到達,我們就設定此事件。 如果事件有一段應用程式定義的時間 (5 秒為合理值) 沒有收到通知訊號,而且使用 **CancellationToken** 並未取消工作,那麼很可能其中一個媒體畫面來源已停止讀取畫面。 在這種情況下,您通常需要關閉畫面讀取器,以便引發應用程式定義的 **CorrelationFailed** 事件。 在此事件的處理常式中,您可以依照本文先前所述的方式,停止畫面讀取器並清理其相關聯的資源。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetNotifyCorrelationFailure":::
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetCorrelationFailure":::
## <a name="use-buffered-frame-acquisition-mode-to-preserve-the-sequence-of-acquired-frames"></a>使用緩衝的畫面擷取模式來保留取得框架的順序
從 Windows 10 版本 1709 開始,您可以設定 **MediaFrameReader** 或 **MultiSourceMediaFrameReader** 的 **[AcquisitionMode](/uwp/api/windows.media.capture.frames.mediaframereader.AcquisitionMode)** 屬性為 **Buffered**,來保留從畫面來源傳遞至您應用程式的畫面順序。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetSetBufferedFrameAcquisitionMode":::
在預設擷取模式 **Realtime** 中,如果您的應用程式仍在處理先前畫面的 **FrameArrived** 時從來源取得多個畫面,系統會將最近取得的畫面傳送給您的應用程式,並捨棄緩衝中等待的其他畫面。 這樣可隨時提供最新的可用畫面給您的應用程式。 這通常是適用於即時視覺電腦應用程式的最有用模式。
在 **Buffered** 擷取模式中,系統會將所有畫面保留在緩衝中,並透過 **FrameArrived** 事件依收到的順序將這些畫面提供給您的應用程式。 請注意,在此模式下,系統的畫面緩衝區已滿時,系統將會停止取得新畫面,直到您的應用程式完成先前畫面的 **FrameArrived** 事件,釋放緩衝區的更多空間。
## <a name="use-mediasource-to-display-frames-in-a-mediaplayerelement"></a>使用 MediaSource 在 MediaPlayerElement 中顯示畫面
從 Windows 版本 1709 開始,您可以在 XAML 頁面的 **[MediaPlayerElement](/uwp/api/windows.ui.xaml.controls.mediaplayerelement)** 控制項中直接顯示從 **MediaFrameReader** 取得的畫面。 這是透過使用 **[MediaSource.CreateFromMediaFrameSource](/uwp/api/windows.media.core.mediasource.createfrommediaframesource)** 來建立 **[MediaSource](/uwp/api/windows.media.core.mediasource)** 物件 (可供 **MediaPlayerElement** 相關 **[MediaPlayer](/uwp/api/windows.media.playback.mediaplayer)** 直接使用) 來達成。 如需使用 **MediaPlayer** 與 **MediaPlayerElement** 的詳細資訊,請參閱[使用 MediaPlayer 播放音訊和視訊](play-audio-and-video-with-mediaplayer.md)。
下列程式碼範例顯示在 XAML 頁面上同時顯示前方和後方相機畫面的簡單實作。
首先,新增兩個 **MediaPlayerElement** 控制項到您的 XAML 頁面。
:::code language="xml" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml" id="SnippetMediaPlayerElement1XAML":::
:::code language="xml" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml" id="SnippetMediaPlayerElement2XAML":::
接下來,使用本文先前章節所述的技術,為前端面板和背面面板上的彩色相機選取包含 **MediaFrameSourceInfo** 物件的 **MediaFrameSourceGroup**。 請注意,**MediaPlayer** 不會自動將畫面的非色彩格式 (例如景深或紅外線資料) 轉換成色彩資料。 使用其他感應器類型可能會產生無法預期的結果。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMediaSourceSelectGroup":::
初始化 **MediaCapture** 物件,以使用所選的 **MediaFrameSourceGroup**。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMediaSourceInitMediaCapture":::
最後,使用相關 **MediaFrameSourceInfo** 物件的 **[Id](/uwp/api/windows.media.capture.frames.mediaframesourceinfo.Id)** 屬性來選取 **MediaCapture** 物件之 **[FrameSources](/uwp/api/windows.media.capture.mediacapture.FrameSources)** 集合中的其中一個畫面來源,以呼叫 **[MediaSource.CreateFromMediaFrameSource](/uwp/api/windows.media.core.mediasource.createfrommediaframesource)** 來為每個畫面來源建立 **MediaSource**。 透過呼叫 **[SetMediaPlayer](/uwp/api/windows.ui.xaml.controls.mediaplayerelement.MediaPlayer)**,初始化新的 **MediaPlayer** 物件,並將它指派給 **MediaPlayerElement**。 然後設定 **[Source](/uwp/api/windows.media.playback.mediaplayer.Source)** 屬性為新建立的 **MediaSource** 物件。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetMediaSourceMediaPlayer":::
## <a name="use-video-profiles-to-select-a-frame-source"></a>使用視訊設定檔來選取畫面來源
以 [**MediaCaptureVideoProfile**](/uwp/api/Windows.Media.Capture.MediaCaptureVideoProfile) 物件表示的相機設定檔,代表特定裝置提供的一組功能,例如畫面播放速率、解析度或 HDR 擷取等進階功能。 擷取裝置可能支援多個設定檔,讓您可以選取最適合您擷取案例的設定檔。 從 Windows 10 版本 1803 開始,在初始化 **MediaCapture** 物件前,您可以使用 **MediaCaptureVideoProfile** 來選取具有特殊功能的媒體畫面來源。 下列範例方法會尋找支援寬色域 (WCG) HDR 的視訊設定檔,並傳回 **MediaCaptureInitializationSettings** 物件,可用來初始化 **MediaCapture** 以使用所選裝置和設定檔。
首先,呼叫 [**MediaFrameSourceGroup.FindAllAsync**](/uwp/api/windows.media.capture.frames.mediaframesourcegroup.findallasync) 以取得目前裝置上可用的所有媒體畫面來源群組清單。 循環顯示每個來源群組並呼叫 [**MediaCapture.FindKnownVideoProfiles**](/uwp/api/windows.media.capture.mediacapture.findknownvideoprofiles) 以取得支援所指定設定檔之目前來源群組的所有視訊設定檔清單,在此例中是 HDR 含 WCG 相片。 如果找到符合條件的設定檔,則建立新的 **MediaCaptureInitializationSettings** 物件,並將 **VideoProfile** 設定為所選設定檔以及將 **VideoDeviceId** 設定為目前媒體畫面來源群組的 **Id** 屬性。
:::code language="csharp" source="~/../snippets-windows/windows-uwp/audio-video-camera/Frames_Win10/cs/Frames_Win10/MainPage.xaml.cs" id="SnippetGetSettingsWithProfile":::
如需使用相機設定檔的詳細資訊,請參閱[相機設定檔](camera-profiles.md)。
## <a name="related-topics"></a>相關主題
* [相機](camera.md)
* [使用 MediaCapture 進行基本相片、視訊和音訊的擷取](basic-photo-video-and-audio-capture-with-MediaCapture.md)
* [相機畫面範例](https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/CameraFrames)
| 98.473077 | 1,566 | 0.808108 | yue_Hant | 0.946011 |
fa1b9fb5588e5b8464d12ae511717b5ce8e8859e | 292 | md | Markdown | README.md | rcorpin/tooltip-buttons-sample | 8817d789017d79833cc51e93c8747821bc918018 | [
"MIT"
] | null | null | null | README.md | rcorpin/tooltip-buttons-sample | 8817d789017d79833cc51e93c8747821bc918018 | [
"MIT"
] | null | null | null | README.md | rcorpin/tooltip-buttons-sample | 8817d789017d79833cc51e93c8747821bc918018 | [
"MIT"
] | null | null | null | # AngularJS Tooltip Buttons
Career Cruising Front-End Engineering Test
## Installation
1. Run `npm install` to install the dependencies and create the minified files for the client using grunt
2. Run `npm start` to start the server
3. Go to `localhost:8000/myApp` to view the angular app
| 26.545455 | 105 | 0.777397 | eng_Latn | 0.979421 |
fa1ba4f0f9e363dfe6c41500ba01e110123cc6b5 | 304 | md | Markdown | CONTRIBUTORS.md | yungone/neuron-finding | 00ccbdb291b0128714f75522c9954c2eaea2808a | [
"MIT"
] | null | null | null | CONTRIBUTORS.md | yungone/neuron-finding | 00ccbdb291b0128714f75522c9954c2eaea2808a | [
"MIT"
] | null | null | null | CONTRIBUTORS.md | yungone/neuron-finding | 00ccbdb291b0128714f75522c9954c2eaea2808a | [
"MIT"
] | 2 | 2019-05-02T00:58:41.000Z | 2019-05-08T21:22:00.000Z | ## Contributor
#### Marcus Hill (marcdh@uga.edu)
- Data preprocssiong
- CNN model
- Documentation
#### Narinder Singh (narindersingh.ghumman@uga.edu)
- ETHICS research and writing
- Documentation
#### Jiahao Xu (jiahaoxu@uga.edu)
- Data visualization
- Data preprocessing
- NMF model
- Documentations
| 17.882353 | 51 | 0.740132 | yue_Hant | 0.501563 |
fa1c7b2dc1711c14aa0fde11d4881973878f1450 | 120 | md | Markdown | SUMMARY.md | johdah/LibrariesCollection_Book | 2a93c8bf52fabf59351f97f7a52e60f3cf38ddde | [
"CC0-1.0"
] | 1 | 2021-10-07T03:10:48.000Z | 2021-10-07T03:10:48.000Z | SUMMARY.md | johdah/LibrariesCollection_Book | 2a93c8bf52fabf59351f97f7a52e60f3cf38ddde | [
"CC0-1.0"
] | null | null | null | SUMMARY.md | johdah/LibrariesCollection_Book | 2a93c8bf52fabf59351f97f7a52e60f3cf38ddde | [
"CC0-1.0"
] | null | null | null | # Summary
* [Java](java/README.md)
* [Javascript/Node.js](javascript/README.md)
* [Stylesheets](stylesheets/README.md)
| 20 | 44 | 0.725 | yue_Hant | 0.931638 |
fa1cf480dceb9c8d4df0191573a61afc83523763 | 338 | md | Markdown | content/post/working-as-sub-reviewer-for-complex-networks-2021/index.md | farhantanvir1/starter-hugo-academic-Farhan | ecedd44ce18ac64fe78d430a25cc637ca747054a | [
"MIT"
] | null | null | null | content/post/working-as-sub-reviewer-for-complex-networks-2021/index.md | farhantanvir1/starter-hugo-academic-Farhan | ecedd44ce18ac64fe78d430a25cc637ca747054a | [
"MIT"
] | null | null | null | content/post/working-as-sub-reviewer-for-complex-networks-2021/index.md | farhantanvir1/starter-hugo-academic-Farhan | ecedd44ce18ac64fe78d430a25cc637ca747054a | [
"MIT"
] | null | null | null | ---
title: Working as sub-reviewer for Complex Networks 2021
date: 2021-09-16T19:55:38.627Z
draft: false
featured: false
image:
filename: featured
focal_point: Smart
preview_only: false
---
I have the opportunity to work as sub-reviewer for Complex Networks 2021. I will earnestly try to review paper with my knowledge and skillset. | 30.727273 | 142 | 0.778107 | eng_Latn | 0.98097 |
fa1d085960b00054323bdad3aa1ec64ccefaf265 | 477 | md | Markdown | server/node_modules/transmitter/README.md | Atanasov86/Car-System | 0928107f9f36cf1815552f3b63b1fca951439ed7 | [
"MIT"
] | 1 | 2019-02-24T07:52:34.000Z | 2019-02-24T07:52:34.000Z | server/node_modules/transmitter/README.md | Atanasov86/Car-System | 0928107f9f36cf1815552f3b63b1fca951439ed7 | [
"MIT"
] | null | null | null | server/node_modules/transmitter/README.md | Atanasov86/Car-System | 0928107f9f36cf1815552f3b63b1fca951439ed7 | [
"MIT"
] | null | null | null | # transmitter
> Dead simple pub-sub
## API
### subscribe(onChange: () => any): { dispose: () => void }
Subscribes to change events. Returns an object which contains the method `dispose` which removes the current subscription.
### publish(...payload: any): void
Emit a change to all the subscribers.
## Example
```js
const bus = transmitter()
bus.subscribe(result => console.log(result))
bus.publish({ foo: 'bar' })
```
## License
[MIT](http://josh.mit-license.org)
| 17.035714 | 122 | 0.677149 | eng_Latn | 0.968407 |
fa1d165afc17879652d12d5fc5a1ec80b17dbfae | 3,592 | md | Markdown | commercial/GA1/1.0commerical.services-swift-deployment-add-disk-storage-node.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | commercial/GA1/1.0commerical.services-swift-deployment-add-disk-storage-node.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | commercial/GA1/1.0commerical.services-swift-deployment-add-disk-storage-node.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | ---
layout: 1.0default
title: "HP Helion OpenStack® 1.0: Add New Scale-out Object Node"
permalink: /helion/openstack/services/swift/deployment/add-disk-object-node/
product: commercial.ga1.0
---
<!--PUBLISHED-->
<script>
function PageRefresh {
onLoad="window.refresh"
}
PageRefresh();
</script>
<!--
<p style="font-size: small;"> <a href=" /helion/openstack/services/object/swift/expand-cluster/">◀ PREV</a> | <a href=" /helion/openstack/services/object/swift/expand-cluster/">▲ UP</a> | <a href=" /helion/openstack/services/object/swift/Monitor-cluster/"> NEXT ▶</a> </p>
-->
# HP Helion OpenStack® 1.0: Add New Scale-out Object Node
[See the Helion OpenStack 1.1 version of this page](/helion/openstack/1.1/services/swift/deployment/add-disk-object-node/)
Perform the following procedure to add new scale-out object node.
1. [Prerequisites](#preq)
2. [Deploying new object nodes](#deploy-new-object-node)
3. [Adding node and disks to object-ring:1](#add-disk-node)
##Prerequisite {#preq}
* HP Helion OpenStack® cloud is successfully deployed.<br /> *(Starter Swift nodes are functional by default as they are part of cloud deployment)*
* Scale-out object-ring:1 is deployed.
##Deploying new object nodes {#deploy-new-object-node}
* Perform the steps mentioned in [Provision Swift Node(s)]( /helion/openstack/services/swift/provision-nodes/) to deploy a new node.
## Adding nodes and disks to object-ring:1 {#add-disk-node}
Once the Swift nodes are deployed, the required disks must be formatted and mounted them before adding them to the Swift cluster.
1. Log in to the undercloud from the seed.
# ssh heat-admin@<undercloud IP address>
# sudo -i
2. Change the working directory to the ring building directory.
# cd /root/ring-building
3. List the available scale-out Swift nodes and identify the newly-created node(s).
# ringos list-swift-nodes -t object
4. List the disks available on the node.
# ringos list-disks -n <object nodes IP address>
5. Format the given disk.
# ringos format-disks -n <object nodes IP address> -d <disk>
**Note**: You can format all the disks with a single command: `-d --all`.
6. List the files in the ring building directory and identify the `object-1.builder` file.
7. Add the formatted disk(s) to the object-1 ring.
# ringos add-disk-to-ring -f /root/ring-building/object-1.builder -i <Object nodes IP address> -p <port> -d <disk label> -w <weight> -r <region> -z <zone>
**Recommendation**: Add drives gradually using a weighted approach to avoid degraded performance of the Swift cluster. The weight will gradually increase by 25% until it reaches 100%. The initial weight is 25.
8. Re-balance the object-1 ring.
# ringos rebalance-ring -f /root/ring-building/object-1.builder
**Note**: You must wait for the length of time specified in `min_part_hours` before another re-balance succeeds.
9. List all the Swift nodes.
# ringos list-swift-nodes -t all
10. Copy `object-1.ring.gz` file to all the nodes.
# ringos copy-ring -s /root/ring-building/object-1.ring.gz -n <Swift nodes IP address>
11. Set the weight of the disks using the following command:
# ringos set-weight -f /root/ring-building/object-1.builder -s <object node IP address> -w <weight>
12. Repeat steps from **8-11** as necessary, increasing the weight by 25 each time. [Change the weight to 50, then 75, and then 100 (w= 50, 75, 100).]
<a href="#top" style="padding:14px 0px 14px 0px; text-decoration: none;"> Return to Top ↑ </a>
----
| 34.538462 | 289 | 0.716036 | eng_Latn | 0.931641 |
fa1de88c594ec3802e31a72fed13f31e2faa8b65 | 1,071 | md | Markdown | README.md | hziling/ORM | 70ce293ee447f76ae1f8f1b9d9f67ab4bfc51765 | [
"MIT"
] | null | null | null | README.md | hziling/ORM | 70ce293ee447f76ae1f8f1b9d9f67ab4bfc51765 | [
"MIT"
] | null | null | null | README.md | hziling/ORM | 70ce293ee447f76ae1f8f1b9d9f67ab4bfc51765 | [
"MIT"
] | null | null | null | ORM
===
ORM for sqlite3 like Django ORM.
Usage:
-----
from datetime import datetime
import database
db = database.Sqlite('blog.db')
class Post(db.Model):
title = database.CharField(20)
content = database.TextField()
created_time = database.DateTimeField()
db.create_table(Post)
post = Post(title='post title', content='post content', created_time=datetime.now())
post.save()
post.id, post.title, post.content
Out: (5, 'post title', 'post content', datetime.datetime(2016, 1, 6, 17, 25, 37, 342000))
print Post.select().where(id=5).all()
Out: [<Post post title>]
The ManyToManyField just like Django ManyToManyField:
class Tag(db.Model):
name = database.CharField(50)
posts = database.ManyToManyField(Post)
When create table from class `Tag`, ORM will auto-create a table `post_tag` which referenced `Post` and `Tag`.
We can add tag to the post like this:
tag = Tag(name='tag')
tag.save()
post.tags.add(tag)
post.tags.all()
Out: [<Tag tag>] | 25.5 | 110 | 0.644258 | eng_Latn | 0.572916 |
fa1e209e878b7fe31207a0b336ef12bc181d1839 | 3,251 | md | Markdown | docs/overview/angel_graph_sona.md | Jennifer88huang/angel | 5a5c11c614c2f3777c5ddbedd2a9272290513f56 | [
"Apache-2.0"
] | 3,262 | 2018-08-27T08:04:50.000Z | 2022-03-29T15:46:03.000Z | docs/overview/angel_graph_sona.md | Jennifer88huang/angel | 5a5c11c614c2f3777c5ddbedd2a9272290513f56 | [
"Apache-2.0"
] | 455 | 2018-08-27T03:09:25.000Z | 2022-03-29T03:05:12.000Z | docs/overview/angel_graph_sona.md | Jennifer88huang/angel | 5a5c11c614c2f3777c5ddbedd2a9272290513f56 | [
"Apache-2.0"
] | 861 | 2018-08-27T04:54:24.000Z | 2022-03-29T08:49:40.000Z | # Angel-Graph
如今,我们身处万物互连的复杂网络世界,人和人、人和物、物和物之间的关系也变得更加复杂多样化,现实中许多问题都可以抽象为图来表达,通过传统图挖掘、图表示学习和图神经网络等图技术,我们可以从海量关系结构的数据中挖掘丰富的信息,以弥补单点分析的不足,最终对金融支付、安全风控、推荐广告等诸多业务场景产生助力。
## 概览
Angel Graph吸收了Angel参数服务器以及Spark、PyTorch优势,使得传统图计算、图表示学习和图神经网络“三位一体”,实现了高性能、高可靠、易用的大规模分布式图计算框架。
Angel Graph有以下核心能力:
- 复杂异构网络。工业界的图数据组成复杂多样,数据规模往往具有十亿级顶点、百亿甚至千亿级边。Angel Graph通过Spark On Angel或Pytorch进行分布式训练,可以轻松支持十亿级顶点、千亿级边的大规模图计算。
- 端到端的图计算。工业界的大数据生态多为Spark、Hadoop。Angel Graph基于Spark On Angel的架构,可以无缝衔接Spark,以便利用Spark 的ETL能力,支持端到端的图学习。
- 传统图挖掘。支持十亿级顶点、千亿级边的传统图算法,如PageRank、Kcore分析节点重要性,Louvain进行社区发现等。提供对顶点的测度分析和丰富的图特征,以便应用到机器学习或推荐风控等业务模型中。
- 图表示学习。支持十亿级顶点,千亿级边的Graph Embedding算法,如LINE、Word2Vec等。
- 图神经网络。支持十亿级顶点,数百亿边的图神经网络算法,利用顶点或边上丰富的属性信息进行深度学习。
## 系统介绍
如下图所示,Angel Graph利用Spark和Pytorch,下层Spark Worker,上层为Angel参数服务。

Angel Graph中的Spark On Angel模块将Angel灵活的参数服务器插件式地赋能给原生Spark,为Spark提供了高效的数据存储/更新/共享服务,因而非常适用于分布式图计算框架。同时Spark On Angel又沿用了原生Spark的变成接口,使得在该框架上的算法开发可以利用Spark的能力。
#### Spark组件
- Spark Driver:负责控制整体计算逻辑
- Spark Executor:在进行传统图计算和图表示学习时,Spark Executor**存储图邻接表/边表**等不可变的数据结构,在每轮迭代中从PS拉取(pull)所需的节点属性等数据,在本地完成计算后将结果推送(push)回PS,交由PS进行更新。在图神经网络算法训练时,Pytorch C++后端作为实际的计算引擎以native的方式运行在Spark Executor中
#### Angel组件
- Angel Master:管理参数服务器的生命周期
- Angel PS(Parameter Server):将节点属性等可变数据以向量的抽象形式储存(**支持存储自定义的元素数据结构,支持负载均衡分区**),通过Angel特有的PS函数实现节点属性in-place更新及其他根据特定算法定制的灵活计算
- Angel Agent:作为代理桥接Spark Executor和Parameter Server
#### Pytorch组件
- Python Client:利用TorchScript语法编写算法模型,交给Spark Executor加载,并通过Angel PS完成模型的分布式训练和预测。
## 内置图算法
为了方便使用图计算框架,我们实现了许多常用图算法,并且在内部业务中进行了充分测试,保证了算法的运行效率和正确性。用户使用时无需过多调整,便可快速使用。
| 算法名称 | 算法类型 | 说明 |
| -------------------------- | -------------- | ------------------------------------- |
| PageRank | 节点重要性计算 | 经典的传统图算法 |
| Hindex | 节点重要性计算 | 混合量化指标,参考H指数 |
| Kcore | 节点特征 | 提取网络中关键子结构 |
| Louvain | 社区发现 | 通过优化模块度指标达到社区划分的目的 |
| Closeness | 接近中心性 | 度量顶点在图中的中心程度 |
| CommonFriends | 共同好友计算 | 计算两个顶点的共同好友数 |
| TriangleCountingUndirected | 三角计数 | 计算每个顶点所在的三角结构个数 |
| LPA | 标签传播 | 一种社区发现或传播算法 |
| ConnectedComponents | 弱连通分量 | 挖掘图的弱连通分量 |
| LINE | 表示学习 | 可利用1阶,2阶邻居信息进行表示学习 |
| Word2Vec | 表示学习 | 一种经典的表征学习算法 |
| GraphSage | 图神经网络算法 | 通过聚合节点邻居的特征进行表示学习 |
| GCN | 图神经网络算法 | 类似CNN操作,并应用到图非欧空间的算法 |
| DGI | 图神经网络算法 | DIM应用到复杂网络领域 |
## BenchMark性能测试
我们在两份真实的数据集下对比Graphx和Angel Graph的性能,其中DS1为8亿顶点,110亿边金融支付网络;DS2为20亿顶点,1400亿边的社交网络,性能测试结果如下所示:

有关Angel Graph的详细介绍,可以参考这篇论文 [PSGraph: How Tencent trains extremely large-scale graphs with Spark?](https://conferences.computer.org/icde/2020/pdfs/ICDE2020-5acyuqhpJ6L9P042wmjY1p/290300b549/290300b549.pdf)
| 41.679487 | 206 | 0.646878 | yue_Hant | 0.721733 |
fa1edfb73ce93054fe32d4eb35a5c4bee68c5bf5 | 5,880 | md | Markdown | ARCHITECTURE.md | jacoblockard99/rbspy | e97de41a4c708447a5e1452ec8918791ef1620e1 | [
"MIT"
] | null | null | null | ARCHITECTURE.md | jacoblockard99/rbspy | e97de41a4c708447a5e1452ec8918791ef1620e1 | [
"MIT"
] | null | null | null | ARCHITECTURE.md | jacoblockard99/rbspy | e97de41a4c708447a5e1452ec8918791ef1620e1 | [
"MIT"
] | null | null | null | # rbspy architecture
rbspy is a little complicated. I want other people to be able to contribute to it easily, so here is
an architecture document to help you understand how it works.
Here’s what happens when you run `rbspy snapshot --pid $PID`. This is the simplest subcommand (it takes a
PID and gets you the current stack trace from that PID), and if you understand how `snapshot` works
you can relatively easily understand how the rest of the `rbspy` subcommands work as well.
The implementation of the `snapshot` function in `main.rs` is really simple: just 6 lines of code.
The goal of this document is to explain how that code works behind the scenes.
```
fn snapshot(pid: pid_t) -> Result<(), Error> {
let getter = initialize::initialize(pid)?;
let trace = getter.get_trace()?;
for x in trace.iter().rev() {
println!("{}", x);
}
Ok(())
}
```
## Phase 1: Initialize. (`initialize.rs` + `address_finder.rs`)
Our first goal is to create a struct (`StackTraceGetter`) which we can call `.get()` on to get a
stack trace. This struct contains a PID, a function, and the address in the target process of the
current thread. The initialization code is somewhat complicated but has a simple interface: you give
it a PID, and it returns a struct that you can call `.get_trace()` on:
```
let getter = initialize.initialize(pid)
getter.get_trace()
```
Here's what happens when you call `initialize(pid)`.
**Step 1**: **Find the Ruby version of the process**. The code to do this is in a function called
`get_ruby_version`.
**Step 2**: **Find the address of the `ruby_current_thread` global variable**. This address is the
starting point for getting a stack trace from our Ruby process -- we start there every time. How we do
this depends on 2 things -- whether the Ruby process we’re profiling has symbols, and the Ruby
version (in 2.5.0+ there are some small differences).
If there are symbols, we find the address of the current thread using the symbol table.
(`current_thread_address_location_symbol_table` function). This is pretty straightforward. We look
up `ruby_current_thread` or `ruby_current_execution_context_ptr` depending on the Ruby version.
If there **aren’t** symbols, instead we use a heuristic
(`current_thread_address_location_search_bss`) where we search through the `.bss` section of our
binary’s memory for something that plausibly looks like the address of the current thread. This
assumes that the address we want is in the `.bss` section somewhere. How this works:
* Find the address of the `.bss` section and read it from memory
* Cast the `.bss` section to an array of `usize` (so an array of addresses).
* Iterate through that array and for every address run the `is_maybe_thread` function on that
address. `is_maybe_thread` is a Ruby-version-specific function (we compile a different version of
this function for every Ruby version). We'll explain this later.
* Return an address if `is_maybe_thread` returns true for any of them. Otherwise abort.
**Step 3**: **Get the right `stack_trace` function**. We compile 30+ different functions to get
stack_traces (will explain this later). The code to decide which function to use is basically a huge
switch statement, depending on the Ruby version.
```
"1.9.1" => self::ruby_1_9_1_0::get_stack_trace,
"1.9.2" => self::ruby_1_9_2_0::get_stack_trace,
"1.9.3" => self::ruby_1_9_3_0::get_stack_trace,
```
**Step 4**: **Return the `getter` struct**.
Now we're done! We return our `StackTraceGetter` struct.
```
pub fn initialize(pid: pid_t) -> Result<StackTraceGetter, Error> {
let version = get_ruby_version_retry(pid).context("Couldn't determine Ruby version")?;
debug!("version: {}", version);
Ok(StackTraceGetter {
pid: pid,
current_thread_addr_location: os_impl::current_thread_address(pid, &version)?,
stack_trace_function: stack_trace::get_stack_trace_function(&version),
})
}
impl StackTraceGetter {
pub fn get_trace(&self) -> Result<Vec<StackFrame>, MemoryCopyError> {
let stack_trace_function = &self.stack_trace_function;
stack_trace_function(self.current_thread_addr_location, self.pid)
}
}
```
## Phase 2: Get stack traces (`ruby_version.rs`, `ruby-bindings/` crate, `bindgen.sh`)
Once we've initialized, all that remains is calling the `get_trace` function. How does that function
work?
Like we said before -- we compile a different version of the code to get stack traces for every Ruby
version. This is because every Ruby version has slightly different struct layouts.
The Ruby structs are defined in a `ruby-bindings` crate. All the code in that crate is autogenerated
by bindgen, using a hacky script called `bindgen.sh`.
These functions are defined through a bunch of macros (4 different macros, for different ranges of
Ruby versions) which implement `get_stack_trace` for every Ruby version. Each one uses the right
Ruby.
There's a lot of code in `ruby_version.rs` but this is the core of how it works. First, it defines a
`$ruby_version` module and inside that module uses `bindings::$ruby_version` which includes all the
required struct definitions for that Ruby version.
Then it includes **more** macros which together make up the body of that module. This is because
some functions are the same across all Ruby versions (like `get_ruby_string`) and some are different
(like `get_stack_frame` which changes frequently because the way Ruby organizes that code changes a
lot).
```
macro_rules! ruby_version_v_2_0_to_2_2(
($ruby_version:ident) => (
pub mod $ruby_version {
use bindings::$ruby_version::*;
...
get_stack_trace!(rb_thread_struct);
get_ruby_string!();
get_cfps!();
get_lineno_2_0_0!();
get_stack_frame_2_0_0!();
is_stack_base_1_9_0!();
}
```
| 44.210526 | 105 | 0.735714 | eng_Latn | 0.994498 |
fa20509e0d6c8e22fbf3183f6eb68ed819bbdf88 | 71 | md | Markdown | src/e/evalCss.i18n.md | WilcoBreedt/licia | 3b0249c0ceffce8e70c31737d98c543cbcd9827a | [
"MIT"
] | null | null | null | src/e/evalCss.i18n.md | WilcoBreedt/licia | 3b0249c0ceffce8e70c31737d98c543cbcd9827a | [
"MIT"
] | null | null | null | src/e/evalCss.i18n.md | WilcoBreedt/licia | 3b0249c0ceffce8e70c31737d98c543cbcd9827a | [
"MIT"
] | null | null | null | ## CN
加载 css 到页面中。
|参数名|类型|说明|
|-----|----|---|
|css|string|css 代码|
| 7.888889 | 19 | 0.450704 | yue_Hant | 0.368742 |
fa2092607bee2237b405a268dc1e0a03d57ebb95 | 1,375 | md | Markdown | cross-powerpc-morphos/README.md | claunia/cross-docks | c5a7187a5816563c24fe00b8bf0b351e8cfe3984 | [
"MIT"
] | null | null | null | cross-powerpc-morphos/README.md | claunia/cross-docks | c5a7187a5816563c24fe00b8bf0b351e8cfe3984 | [
"MIT"
] | null | null | null | cross-powerpc-morphos/README.md | claunia/cross-docks | c5a7187a5816563c24fe00b8bf0b351e8cfe3984 | [
"MIT"
] | null | null | null | # Cross compilation environment
| | |
|--------------------------:|:--------------------------------------------------|
| **Compiler:** | 9.2.0 |
| **Target architecture:** | PowerPC |
| **Target OS:** | MorphOS |
| **AS:** | `/opt/ppc-morphos/bin/ppc-morphos-as` |
| **LD:** | `/opt/ppc-morphos/bin/ppc-morphos-ld` |
| **AR:** | `/opt/ppc-morphos/bin/ppc-morphos-ar` |
| **CC:** | `/opt/ppc-morphos/bin/ppc-morphos-gcc` |
| **CXX:** | `/opt/ppc-morphos/bin/ppc-morphos-g++` |
| **RANLIB:** | `/opt/ppc-morphos/bin/ppc-morphos-ranlib` |
| **CMake toolchain file:** | `/opt/ppc-morphos/lib/ppc-morphos.cmake` |
| **SSH daemon:** | *Yes* |
| **Username:** | `user` |
| **Password:** | `password` |
Installed using [Marlon Beijer's Docker image](https://hub.docker.com/layers/amigadev/crosstools/ppc-morphos).
| 68.75 | 110 | 0.323636 | eng_Latn | 0.103865 |
fa20f6c2615888c5d01538b7104bf5b773976815 | 159 | md | Markdown | src/api/Readme.md | merbla/RESTChess | 3a6b3a214d2242c60bda8bb179ca09cce369286b | [
"Apache-2.0"
] | null | null | null | src/api/Readme.md | merbla/RESTChess | 3a6b3a214d2242c60bda8bb179ca09cce369286b | [
"Apache-2.0"
] | null | null | null | src/api/Readme.md | merbla/RESTChess | 3a6b3a214d2242c60bda8bb179ca09cce369286b | [
"Apache-2.0"
] | null | null | null | # [REST Fest 2015](http://restfest.org/)
September 17-19th in Greenville, SC
Join the action on the [wiki](https://github.com/RESTFest/2015-Greenville/wiki).
| 31.8 | 80 | 0.742138 | yue_Hant | 0.44246 |
fa210728ae95c01faa9f54bdde180feb2e29aead | 10,895 | md | Markdown | Week_11/Theory/README.md | WebToLearn/trading-app | ff5583cf515a9e9cf4e3cd83838869b1adc777fe | [
"MIT"
] | 24 | 2018-09-25T20:29:17.000Z | 2021-04-17T19:45:41.000Z | Week_11/Theory/README.md | WebToLearn/trading-app | ff5583cf515a9e9cf4e3cd83838869b1adc777fe | [
"MIT"
] | 92 | 2019-10-08T08:25:52.000Z | 2022-02-20T08:10:47.000Z | Week_11/Theory/README.md | WebToLearn/trading-app | ff5583cf515a9e9cf4e3cd83838869b1adc777fe | [
"MIT"
] | 14 | 2019-03-09T09:43:51.000Z | 2021-12-14T06:09:08.000Z | # Week 11 - Automation Testing Introduction
**Performance Testing Core Activities**
**Overview**
This document provides a high-level introduction to the most common core activities involved in automation UI testing your applications and the systems that support those applications.
Automation testing is a complex activity that cannot effectively be shaped into a “one-size-fits-all” or even a “one-size-fits-most” approach. Projects, environments, business drivers, acceptance criteria, technologies, timelines, legal implications, and available skills and tools simply make any notion of a common, universal approach unrealistic.
Automation testing should handle all the activities which are very repetitive and should handle all the prone to mistake or prone to fail test cases from the functional testing suites. The test suites should have test cases which are very simple, the framework should be coded to respect the programming principles (DRY, KISS, etc.)
The output of this module for students is for them to be able to know why is automation necessary, what it does, have an understanding of what automation does and how it can handle the UI testing and to know the basic structure of a suite.
## Table of contents
1. [Why Implement Automated testing?](#why-implement-automated-testing?)
2. [What should we automate?](#what-should-we-automate?)
3. [Challenges when working with automated testing](#challenges-when-working-with-automated-testing)
4. [Tools and integrations](#Tools-and-integrations)
5. [Example of a most basic UI testing Script](#example-of-a-most-basic-ui-test)
## Why implement Automated testing?
- **To do repeating work the functional teams should do**
Automation comes in handy when there is a need for sanity checks and regression testing.
- ``Regression testing is the part of the functional testing which ensures the functionalities which have already been in place are not broken by the new code inserted through the development process``
- ``Sanity testing is an activity which checks that the main functionalities of the applications are in working. This is done as the first part of testing a new build.``
Automation is very good ally to the project team since it can handle very numerous test cases faster and reliable, every time in the same conditions without human interaction.
Automation framework can reliably run for many items the same scenarios with the same data, it can repeat the same action as to ensure that there are no changes to the system/application under test while having multiple activities repeated over and over again.
Automation is usually done to mimic the actions of average users through the systems and to get the functional testing teams the possibility of running more creative test scenarios as to be able to check more of the system/application under test.
Automation does not handle the testing of a system/application alone, it is a way for the functional team to have more time to creatively test a system/application and it also guarantees the testing team that the scenarios are done in the exact same way over and over again.
- **To have testing integrated in a CI/CD lifecycle**
Automated testing comes in very handy, alongside performance testing in a CI/CD lifecycle.
Usually in a CI/CD system automation and performance testing are used to validate a build for the next phases.
``For example: A new version is being deployed from a lower environment to an upper one, at first there will be a suite of unit tests (which will validate that the code and its components are in working conditions and there are no mistakes inserted into the build), then a Sanity check will commence and a Regression suite will follow (the Sanity will validate the main functionalities ad the Regression will validate that the full functionalities are working without issues) and after all the functional automated tests there will commence the Performance testing which will validate that the system/application is responding in the agreed parameters. After all the aforementioned activities are marked as **PASSED** the build is validated and can proceed to the next phase. (Besides Automation and Performance there ay also be some ther testing types being done, such as Penetration testing and other security-wise testings)``
## What should we automate?
There are a few key questions which help us decide what we automate:
- What can be done in the chosen programming language?
Any functionality from the system/application under test that can be interacted with (front end or back end) through a programming language should be automated so that the posibility of handling as many scenarios as possible is taken care of.
- What can be done without human/third party interactions
For this question the team should be able to asses what can be scenarios can be executed without human or third party interractions or how much of that given data can be mocked for it to ensure a valid testing approach.
``For example: if there is a need to take some input data as JSON from another application, how costly would it be to have it as an input object, and how would it be created and processed before the test scenario is executed?`` (a sketch of this idea follows at the end of this list)
- What repetitive actions/scenarios are there?
``For example, in our course application we have a login page; we could log in and log out 100 times to validate that the system can successfully fulfil that task as many times as needed.``
For a functional tester these kinds of testing scenarios are time-consuming and prone to mistakes. (If there is an error on the 10th login attempt because the tester mistypes the password, he or she has to restart the whole process; an automated suite is therefore more reliable in such scenarios than any human.)
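
As a hedged sketch of the JSON-input idea above: the upstream application's payload can be stored as a file and deserialized into an input object before the scenario runs. This assumes the Jackson library is on the classpath; the `OrderInput` type, its field names and the file path are hypothetical.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.io.IOException;

public class JsonInputSketch {

    // Hypothetical shape of the JSON the other application would normally send
    public static class OrderInput {
        public String orderId;
        public double amount;
    }

    public static void main(String[] args) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        // The local file stands in for the real inter-application call
        OrderInput input = mapper.readValue(new File("src/test/resources/order.json"), OrderInput.class);
        System.out.println("Running the scenario with mocked order " + input.orderId + " / " + input.amount);
    }
}
```
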
## Challenges when working with automated testing
- There can be many challenges with automated testing, but the most common ones are the following:
- Deciding what to automate from Front End and/or Back End
- Automation can handle UI tests, Back End tests or both, depending on how the systems are built, their architecture, interfacing applications, etc.
- Anything that is dynamic
- This mostly applies to the front end, where a more responsive application and a more dynamic UI can generate a lot of challenges for the automation team to overcome
``For example: the UI may have completely different locators for the same element when viewed from a tablet/desktop browser/mobile phone etc. (buttons with other locators or just different options available in the app)``
- Keeping it simple, small, robust and easy to maintain
- First of all, it is much better to have many small tests than a few large ones, and it is very good to set the granularity level from the beginning, otherwise many problems can occur along the way
- After creating a suite, a QA Engineer may encounter many changes to the system which need to be handled: changes in the workflow of the application, changes of the environment, etc.
- The robustness of a test script is dictated by whether the test can run successfully or pinpoint a failure in the application without producing false positives or false negatives (passing despite a bug, or failing without anything being wrong with the application); a sketch of one robustness technique follows below
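
One common way to make a script more robust against dynamic UIs is an explicit wait: instead of failing the moment an element is not yet rendered, the test waits up to a timeout for it to appear. The sketch below assumes Selenium 4 (the `Duration`-based `WebDriverWait` constructor); the ten-second timeout is an arbitrary example.

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class RobustLookup {

    // Waits for a dynamic element instead of failing immediately,
    // which avoids false negatives on slow or responsive UIs
    public static WebElement waitForVisible(WebDriver driver, By locator) {
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
        return wait.until(ExpectedConditions.visibilityOfElementLocated(locator));
    }
}
```
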
## Tools and integrations
- Creating an automation framework requires the usage of various tools, libraries and integrations with other systems
**Tools**:
- For UI:
- The most used is Selenium, but there are other tools, like Protractor, Cypress.io, Appium (for mobile apps) etc.
- Most UI tests require locators for the elements (id, class name, tag, XPath etc.) - see the sketch after this list
- For Back End:
- Web services - Postman, custom-built clients, REST Assured (web services and other APIs), database connectors etc.
- Programming language
- Here the choice is mostly dictated by the knowledge of the team, by how fast the team can learn a certain programming language, and by how well a certain programming language is suited for what needs to be done
- Software project management tools: Maven, Gradle, etc.
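
For illustration, here are the usual Selenium locator strategies side by side; the attribute values are hypothetical, except the XPath one, which matches the basic test at the end of this document.

```java
import org.openqa.selenium.By;

public class CommonLocators {
    // The same element can often be located in several ways, depending on what the page offers:
    By byId        = By.id("loginButton");                 // fastest and most stable when an id exists
    By byClassName = By.className("btn-primary");          // handy, but CSS classes are often shared
    By byTag       = By.tagName("button");                 // very broad - usually needs narrowing down
    By byXPath     = By.xpath("//button[@type='submit']"); // flexible; used by the basic test below
}
```
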
**Integrations**:
- Test Management tools
- Test Management tools help the QA Teams prepare testing scenarios and validate them
- Examples: HP ALM, Zephyr, DevTest, PractiTest, TestRail, TESTPAD, XRAY, HipTest etc.
- Reporting Tools:
- Reporting tools help us track issues (Bugs, tickets, tasks, etc.)
- Examples: JIRA, HP ALM, BugZilla, BugHerd, Microsoft Azure DevOps Server, DevTrack etc.
- Continuous Integration/Continuous Delivery
- The ability of an automated framework to be integrated into a pipeline for seamless CI/CD with tools like TeamCity, Jenkins etc. means that the tests can be run remotely and the results can be parsed in such a manner as to trigger the following step in that pipeline.
## Example of a Most Basic UI Test
```java
package basicTests;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
public class BasicTest {

    private WebDriver driver;

    /**
     * What is done before the actual test is run.
     * The @BeforeClass annotation marks actions that run once before all the methods annotated with @Test.
     * A new instance of the WebDriver for Chrome is created and the browser is launched.
     */
    @BeforeClass
    public void setup() {
        // actual location of the WebDriver executable
        System.setProperty("webdriver.chrome.driver", "src\\main\\resources\\chromedriver.exe");
        driver = new ChromeDriver();
    }

    /**
     * Actual test case.
     * Steps:
     * 1. go to the application
     * 2. print the current URL - a check could also be added here to verify the browser is not redirected elsewhere
     * 3. check that the login button is present
     */
    @Test
    public void basicTest() {
        driver.get("http://norulmeu.go.ro:8000/login");
        System.out.println(driver.getCurrentUrl());
        Assert.assertTrue(driver.findElement(By.xpath("//button[@type='submit']")).isDisplayed(),
                "Login button is not shown in the Landing page");
    }

    /**
     * This is what happens after the test is run.
     * The browser is closed; optionally you can do some cleanup here - delete data that was created
     * during the test or revert changes so that other tests are not affected by a possible test failure.
     */
    @AfterClass
    public void tearDown() {
        driver.quit();
        // optional data clearance
    }
}
```
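
Where repetition itself is the point (like the 100-logins idea earlier), TestNG can re-run the same method many times without human interaction. The fragment below is a hypothetical addition to the class above; `invocationCount` is a standard TestNG attribute, while the scenario and assertion message are illustrative assumptions.

```java
// Hypothetical extra method for the BasicTest class above
@Test(invocationCount = 100) // runs this scenario 100 times in a row
public void repeatedLoginPageCheck() {
    driver.get("http://norulmeu.go.ro:8000/login");
    Assert.assertTrue(driver.findElement(By.xpath("//button[@type='submit']")).isDisplayed(),
            "Login button was not shown on one of the repeated visits");
}
```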
| 67.253086 | 928 | 0.759982 | eng_Latn | 0.999401 |
fa21c92934c08a4e6530558bfb6a435a1362f48c | 7,106 | md | Markdown | Markdown/10000s/10000/message hopes.md | rcvd/interconnected-markdown | 730d63c55f5c868ce17739fd7503d562d563ffc4 | [
"MIT"
] | 2 | 2022-01-19T09:04:58.000Z | 2022-01-23T15:44:37.000Z | Markdown/00500s/04000/message hopes.md | rcvd/interconnected-markdown | 730d63c55f5c868ce17739fd7503d562d563ffc4 | [
"MIT"
] | null | null | null | Markdown/00500s/04000/message hopes.md | rcvd/interconnected-markdown | 730d63c55f5c868ce17739fd7503d562d563ffc4 | [
"MIT"
] | 1 | 2022-01-09T17:10:33.000Z | 2022-01-09T17:10:33.000Z | - Mad dirty in the most. Mary fruit of which the immediate. And and played moved said it. Seen who made the with as and. And suffer for and to race said.
- Jack the write sold incomplete performing. Coming attention natural de the had. To bed house his thus atmosphere. Down myself sent not his not.
- As conceived of between you the her though. Now in you count of place. Witness that Canada Mr walk. And and your come stronger the. Do us sure that the. At they from too that that. And the and and the hung tomorrow [[minds]]. And the explicit instantly be you swimming. [[hopes]] to Christian critical this. Of the he man castle. The loose learn passions upon he have. Received angle the new house be one. Their man her were youll. Good ladies exceptions who of which had. Purchase [[lifted wore]] that them me much. And no his with served alternate. The only was their his and. To this to as will in she. Old aunt proposition an to beat. Between princes of has the do charity himself. Be Gutenberg the or is illustrate. Southern all to strength of his never and. Have of preceding not his say young. Given of holding answer the he beloved. Situation it expected fatal she by of. Get distinct and day learn being. And such that anything back was be. Their in more [[hopes]] the. Is limited and same hard when and. Surprised sent within c found dean or the. They that so back about.
- Had short high all her. Changes their lines the of to. Their the the i cut of Rome. Rock up saluted of circle to not. You others have time six mentioned being. Said old in of their have. Anna days angel territory hill spirit refused to. Want generosity vice but the great. Breath in to of introduced. Principles his this talking he the whole. Me had pair but opposite i. Always establishment its also asked. The from commonplace and to often had asked. This a had sick benefit and. Of combination [[empty January]] word so the bidding quite. She him suffering it time head victim. There only left that in the their. Absence in Mrs foundations the made a the but. Poorly to tell like nothing she. Of was help and in any. We wonderful have were town. And on of record a of. To of time and from between creative set. Able will undertaken was dinner see evident. To clergyman air going general and. Falsehood withstand obtained the or themselves leaves. Ground quietly be the thought i. Their her upon my to meaning with and. With which estate on in and. Too even crush in. When could of objects. [[carrying]] upon understood walls i are they of. Battle his in that providing slightly its. Stay deemed and find. Evening was as the woman. One finally as de their preparing. Looked [[literature]] i the of business the. [[December]] this english parent the darted from enough. Something try man chap [[suffer]] [[dressed suffer]].
- You again them the in the i the. Flashed of and the being volume. They fruit although advantages might which the. Laughter not with Francis letters ear course on.
- Of rules of into of the. Stood am i to that the and and. On as have quick snatched represented i. Door times that her it the this other. Her them with two. Grand and fear been Mr ignorant was. Is Mary you had good to him. His he helmet the this in possible his. In all Gutenberg half make but of. Clothing already over to the for. Informed and Mr of physical cant glance. A some he ransom said not scanty no. From is could muttered second. In case shine north beside down destroyed. Sentence i enter this heavily the those very. Continued was persecution nor [[moments dressed]] be there.
- Some v person money will whether gold.
- Had inheritance speech that have and known.
- The in every not here Percy.
- For were of most which way his.
- Ye take the was liberty.
- Happened narrow hour gains i the.
- And carriages them returned some as often.
- How in Florida unless in wish good.
- Give listen them be the all.
- Her two her grew country in entered.
- Mrs me into for show the her.
- The said floor be drink any barrier.
- By our no well point to.
- From social cause in demand i city.
- We day built force over.
- Place brother and but done the unknown men.
-
- Around em or by seen not not. Of Fred the support [[suffer]] falling beginning. From he the rough some who. The wed examination as and [[lifted proposed]] the. Told the by five would afforded of me. Largely were which and conversation in. Fortress much the his exchanged you the. Provided the enough wings of did bed. Purposes secret Ive statements thus useful as he.
- Always always of as be his must. In in the complete formed our likewise. Pointed granted like your you that. I seal and the excitement or completely. Organization turned phrase the at so midst. It this of to or to. While said he the and affected. For you little stood the you. Most state said large in the by book. Is of been advantages in to this. Face wife would people to his the. Weighed the did in wasted over robert. Gives pestilence the shall afternoon bands power sun. [[phrase bay]] it the of about. Florence [[phrase welcome]] weeks said temper at before by. Visit is passages round will seized as house. Of be had different beam. By in and street cried ribbon. Eyes an fertility and mi to. The of old finished the of. And Christ blow northern by that to. To from parish prayed the acted but i. Embarked the the under. Were to might of work. With plan unkind the perils more. Mentioned worse enough Ireland only to. Had passed narrow of from. The the that to terrible system. Midst too as conferred to of copyright. Intervals fresh person with shows. Life there the in what. Asked taken of them ship hired.
- What i but think me it never what. [[laughing]] king host made altogether by act. Was my have investigation long of having. He place shoulder or many. While of it the soon. Cannon the bestowed destroyed most man. One said as most Michael than. In greatest of you warfare with on. It out in articles the i an. That silver strain one through surf at. Him throng it his had her. Not ferocity [[minds hopes]] be since. One they to been any in of station thought. Feel scourge in an be let. Their [[dressed]] that the aunt the i. Per now amount man or scarcely gradually of. For had are warning preservation succeeded beginning family. Is and ever subject went to. The but commerce the seem [[collection]]. The this to triumphant white. Had window now life hall the.
- Gods the of continued it i. Network lie are than Sophia with out he. Person suffer the nor comes. Of of forever rare the the they. Great be sides any the we and adoption. There but as roof show one. Be filled whether savings was it the. Wish the probably this them of and. Ask showing so who window. Such her wooden limited been insulted with. Name representations and of which her George Mr. The that up not meantime aware work. The the finger than boat we buy. Indulged her deadly came. The for to use in he him. Wish door than than that yet of. Suggestion beauty over heaven you and. Who and could it them on mighty. And those honest hatred the pension with. | 263.185185 | 1,427 | 0.767239 | eng_Latn | 0.999963 |
fa22b8c4aeaedeb717effd9654e4c7fefe50f3ad | 2,625 | md | Markdown | extractorTest/dn127039(v=exchg.80).md | bianliu1013/extractorTest | 27ea0e4f2657a8b36bb54ed486afcb7e80c4add0 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | extractorTest/dn127039(v=exchg.80).md | bianliu1013/extractorTest | 27ea0e4f2657a8b36bb54ed486afcb7e80c4add0 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | extractorTest/dn127039(v=exchg.80).md | bianliu1013/extractorTest | 27ea0e4f2657a8b36bb54ed486afcb7e80c4add0 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Test MCA: Impossibile visualizzare le informazioni sulla disponibilità di un altro utente'
TOCTitle: 'Test MCA: Impossibile visualizzare le informazioni sulla disponibilità di un altro utente'
ms:assetid: 4ec87a51-fc6e-477f-8ccf-7520f64193fd
ms:mtpsurl: https://technet.microsoft.com/it-it/library/Dn127039(v=EXCHG.80)
ms:contentKeyID: 53089510
ms.date: 10/25/2013
mtps_version: v=EXCHG.80
_tocRel: dd439364(v=exchg.80)/toc.json
ms.translationtype: HT
---
# Test MCA: Impossibile visualizzare le informazioni sulla disponibilità di un altro utente
_**Ultima modifica dell'argomento:** 2013-02-21_
L'analizzatore connettività di Microsoft include il test **Impossibile visualizzare le informazioni sulla disponibilità di un altro utente**. Questo test verifica che una cassetta postale di Office 365 possa accedere alle informazioni sulla disponibilità di una cassetta postale locale e viceversa.
Il test include le seguenti funzionalità:
- Un controllo per verificare che l'ora di sistema del server ibrido non sia sfalsata di più di cinque minuti. Se lo sfalsamento supera i cinque minuti, si verificano degli errori quando Microsoft Federation Gateway richiede i token di delega.
- Un controllo per verificare che la connettività al server ibrido in ingresso non richieda una preautenticazione del firewall. Il firewall deve consentire l'autenticazione pass-through. Questo controllo è utile quando il client MCA viene eseguito dall'esterno della rete aziendale.
- Un controllo per verificare che il server ibrido supporti i requisiti minimi per la versione di Exchange Server.
- Una query di base sulla disponibilità rispetto al servizio di disponibilità di destinazione.
- Collegamenti alle istruzioni sulla procedura guidata di configurazione ibrida. Questa procedura guidata è una comune fonte di potenziali errori di configurazione nella distribuzione ibrida.
L'analizzatore connettività remota di Microsoft è un nuovo strumento per cui è attualmente disponibile una documentazione limitata. Nell'ottica di un continuo miglioramento della documentazione disponibile, Microsoft si affida a tutti gli utenti della community per la segnalazione degli eventuali errori rilevati. Utilizzare la sezione Contenuti Community più avanti per segnalare eventuali altri motivi che potrebbero aver provocato gli errori riscontrati fino a questo punto. Se si ha bisogno di assistenza tecnica, creare un post nel [forum TechNet per Exchange](http://go.microsoft.com/fwlink/p/?linkid=73420) appropriato oppure contattare il [supporto tecnico](http://go.microsoft.com/fwlink/p/?linkid=8158).
| 84.677419 | 714 | 0.815619 | ita_Latn | 0.999531 |
fa231ec6bcb49b700c50df2d98ea03b912321dae | 1,467 | md | Markdown | _posts/2021-05-28-18.md | KhanBe/KhanBe.github.io | 5b56ef5a737d89a00dcd340e68d3c02098bc02f2 | [
"MIT"
] | null | null | null | _posts/2021-05-28-18.md | KhanBe/KhanBe.github.io | 5b56ef5a737d89a00dcd340e68d3c02098bc02f2 | [
"MIT"
] | null | null | null | _posts/2021-05-28-18.md | KhanBe/KhanBe.github.io | 5b56ef5a737d89a00dcd340e68d3c02098bc02f2 | [
"MIT"
] | null | null | null | ---
title: "[알고리즘]-깊이 우선 탐색(DFS, Depth-First Search)"
excerpt: "DFS와 BFS에 대해서."
categories:
- 알고리즘
tags:
- java
- algorithm
- 자료구조
last_modified_at: 2021-05-28T08:06:00-05:00
---
**깊이 우선 탐색(DFS, Depth-First Search)**
## 개념
루트 노드(혹은 다른 임의의 노드)에서 시작해서 다음 분기(branch)로 넘어가기 전에 해당 분기를 완벽하게 탐색하는 방법
사용하는 경우 : 모든 노드를 방문 하고자 하는 경우
**dfs가** bfs보다 **간단하다.**
단순 **검색 속도는** bfs보다 **느리다.**
## 특징
**순환 알고리즘 형태**이다.
**탐색 시 노드를 방문 했었는지 여부를 검사해야한다. 아니면 무한루프에 빠진다.**
## 구현 방법
1. 재귀함수 호출
2. 스택(Stack) 사용
## 시간 복잡도
DFS(정점의 수:N, 간선의 수:E)의 모든 간선을 조회한다.
- 인접리스트로 표현된 그래프 : O(N+E)
- 인접행렬로 표현된 그래프 : O(N^2)
## DFS 구현 코드 예시
```java
//----------------------재귀함수 DFS
public static void dfs(int i) {
visited[i] = true; //먼저 방문했다고 표시
System.out.print(i + " ");
for(int j = 1; j <= n; j++) {//한쪽을 깊게
if(arr[i][j] == 1 && !visited[j]) { //이어져있고 방문 안한 곳
dfs(j);
}
}
}
}
//------------ 스택을 이용한 DFS
public static void bfs(int start) {
Stack<T> stack = new Stack<T>(); // 객체 선언
stack.push(start); // 시작점
visited[start] = true; // 방문 표시
System.out.print(start + " ");
while(!stack.isEmpty()) {
int temp = stack.pop();
System.out.print(temp + " ");
for(int j = 1; j <= n; j++) {
if(arr[temp][j] == 1 && !visited[j]) { // 이어져있고 방문 안한 곳
stack.push(j); // 스택에 넣고
visited[j] = true; // 방문 표시
}
}
}
}
```
## 요약
- 스택과 재귀함수로 구현한다.
- DFS는 한쪽을 먼저 다 검색한다. | 16.3 | 69 | 0.518064 | kor_Hang | 0.999988 |
fa23dd542ca01a332b8d6a07e77cdcc65821d8ee | 2,068 | md | Markdown | README.md | mjdarby/Objection | 57afe3163253988d4f3c70224084886880b3d187 | [
"MIT"
] | 1 | 2020-11-26T18:44:16.000Z | 2020-11-26T18:44:16.000Z | README.md | mjdarby/Objection | 57afe3163253988d4f3c70224084886880b3d187 | [
"MIT"
] | null | null | null | README.md | mjdarby/Objection | 57afe3163253988d4f3c70224084886880b3d187 | [
"MIT"
] | 1 | 2020-03-27T14:36:10.000Z | 2020-03-27T14:36:10.000Z | # Objection
A quick and dirty hotword-based control for Phoenix Wright: Ace Attorney Trilogy on Windows PC. Finally, screaming Objection!, Take That!, and Hold It! becomes a reality for the first time since the DS release.
## Demo video
[Take a look!](https://streamable.com/s/7ac4r/vgbzuc)
## How-To
- Check out the repository!
- Install the prerequisites as in requirements.txt
- This is easiest on Python 3.6, which already has a pre-built wheel for PyAudio.
- Generate three Porcupine hotword files and put them in the root folder of Objection:
- objection_windows.ppn
- take_that_windows.ppn
- hold_it_windows.ppn
- Boot up Ace Attorney Trilogy
- Run objection.py
If all goes well, Objection will hook into the Ace Attorney window and wait for microphone input.
Because of the usage of the Porcupine library, the ppn files will eventually expire and need to be regenerated. Refer to the [Porcupine documentation](https://github.com/Picovoice/Porcupine) on how to do the inital generation and subsequent renewals.
## Binaries
A binary release for amd64-compatible machines will be available soon.
### Building your own binary
- Follow the 'how-to' steps
- Modify the Porcupine paths in objection.py as necessary to match your target system
- Run the following command after also installing pyinstaller: `pyinstaller --add-data "porcupine;porcupine" --add-data "<your python root directory>/lib/site-packages/_soundfile_data;_soundfile_data" --onefile -p porcupine/binding/python objection.py`
## TODOs
- Slim down the repository a bit, don't need all the Porcupine stuff
- Enable 'classic' mode - ie. hold down a button to start microphone detection
- Dark magic to only have to run one executable to start both game and detector
- Darker magic to patch the detector into the game itself
## Licenses
The Apache 2.0 licensed portions of the Porcupine library are included in this repository unaltered, save the resources folder.
The Objection software itself is licensed under the MIT License, which can be found in the LICENSE file.
| 49.238095 | 253 | 0.77853 | eng_Latn | 0.993135 |
fa2401e04ece8f324b93bc9a9b75d0620ac52b5b | 3,514 | md | Markdown | articles/cognitive-services/immersive-reader/tutorial-android.md | gencomp/azure-docs.de-de | ea9dc9bb0bf0a7673d4f83d8a8187d55087b3bce | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/immersive-reader/tutorial-android.md | gencomp/azure-docs.de-de | ea9dc9bb0bf0a7673d4f83d8a8187d55087b3bce | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/immersive-reader/tutorial-android.md | gencomp/azure-docs.de-de | ea9dc9bb0bf0a7673d4f83d8a8187d55087b3bce | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Tutorial: Starten von Plastischer Reader unter Verwendung der Android-Codebeispiele'
titleSuffix: Azure Cognitive Services
description: In diesem Tutorial wird eine Android-Beispielanwendung zum Starten von Plastischer Reader konfiguriert und ausgeführt.
services: cognitive-services
author: dylankil
manager: guillasi
ms.service: cognitive-services
ms.subservice: immersive-reader
ms.topic: tutorial
ms.date: 06/10/2020
ms.author: dylankil
ms.openlocfilehash: d847b4ab9f3c06634390e1f67dfed36df938c3ad
ms.sourcegitcommit: e132633b9c3a53b3ead101ea2711570e60d67b83
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 07/07/2020
ms.locfileid: "86049310"
---
# <a name="tutorial-launch-the-immersive-reader-using-the-android-java-code-sample"></a>Tutorial: Starten von Plastischer Reader unter Verwendung des Java-basierten Android-Codebeispiels
In der [Übersicht](./overview.md) haben Sie gelernt, was Plastischer Reader ist und wie das Tool bewährte Techniken implementiert, um das Leseverständnis von Sprachenlernenden, Leseanfängern und Schülern mit Lernunterschieden zu verbessern. In diesem Tutorial erfahren Sie, wie Sie eine Android-Anwendung erstellen, die Plastischer Reader startet. In diesem Tutorial lernen Sie Folgendes:
> [!div class="checklist"]
> * Konfigurieren und Ausführen einer App für Android unter Verwendung eines Beispielprojekts
> * Abrufen eines Zugriffstokens
> * Starten von Plastischer Reader mit Beispielinhalt
Wenn Sie kein Azure-Abonnement besitzen, können Sie ein [kostenloses Konto](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) erstellen, bevor Sie beginnen.
## <a name="prerequisites"></a>Voraussetzungen
* Eine Ressource des plastischen Readers, die für die Authentifizierung mit Azure Active Directory konfiguriert ist. Befolgen Sie [diese Anweisungen](./how-to-create-immersive-reader.md) für die Einrichtung. Einige der hier erstellten Werte benötigen Sie bei der Konfiguration der Umgebungseigenschaften. Speichern Sie die Ausgabe Ihrer Sitzung zur späteren Verwendung in einer Textdatei.
* [Git-Client](https://git-scm.com/)
* [SDK für Plastischer Reader](https://github.com/microsoft/immersive-reader-sdk)
* [Android Studio](https://developer.android.com/studio)
## <a name="configure-authentication-credentials"></a>Konfigurieren von Anmeldeinformationen für die Authentifizierung
1. Starten Sie Android Studio, und öffnen Sie das Projekt im Verzeichnis **immersive-reader-sdk/js/samples/quickstart-java-android** (Java) oder im Verzeichnis **immersive-reader-sdk/js/samples/quickstart-kotlin** (Kotlin).
2. Erstellen Sie im Ordner **/assets** eine Datei namens **env**, und fügen Sie Folgendes hinzu (unter Angabe geeigneter Werte). Committen Sie diese Datei nicht in Ihre Quellcodeverwaltung, da sie Geheimnisse enthält, die nicht für die Öffentlichkeit bestimmt sind.
```text
TENANT_ID=<YOUR_TENANT_ID>
CLIENT_ID=<YOUR_CLIENT_ID>
CLIENT_SECRET=<YOUR_CLIENT_SECRET>
SUBDOMAIN=<YOUR_SUBDOMAIN>
```
## <a name="launch-the-immersive-reader-with-sample-content"></a>Starten von Plastischer Reader mit Beispielinhalt
1. Wählen Sie im AVD-Manager einen Geräteemulator aus, und führen Sie das Projekt aus.
## <a name="next-steps"></a>Nächste Schritte
* Machen Sie sich mit dem [SDK für Plastischer Reader](https://github.com/microsoft/immersive-reader-sdk) und der [zugehörigen Referenz](./reference.md) vertraut.
* Sehen Sie sich Codebeispiele auf [GitHub](https://github.com/microsoft/immersive-reader-sdk/tree/master/js/samples/). | 60.586207 | 388 | 0.805919 | deu_Latn | 0.964468 |
fa241bdfae4c2280c64f4ce6fd7a968edce3a6ee | 356 | md | Markdown | _posts/2021-01-21-linux-thread-schedule.md | xiaohanghang/xiaohanghang.github.io | e475dc80abba2f12a1ae591bed7b08568c7fafb3 | [
"MIT"
] | null | null | null | _posts/2021-01-21-linux-thread-schedule.md | xiaohanghang/xiaohanghang.github.io | e475dc80abba2f12a1ae591bed7b08568c7fafb3 | [
"MIT"
] | null | null | null | _posts/2021-01-21-linux-thread-schedule.md | xiaohanghang/xiaohanghang.github.io | e475dc80abba2f12a1ae591bed7b08568c7fafb3 | [
"MIT"
] | null | null | null | ---
layout: post
title: thread of linux schedule.
date: 2021-01-20 12:16:00
summary: linux
categories: jekyll pixyll
---
### 用户级线程
> 在用户级线程中, 有关线程管理的工作都由引用程序完成, 内核意识不到线程的存在, 应用程序可以用使用线程库设计成多线程程序, 内核意识不到线程的存在.
这种将多个用户级线程映射到一个内核级线程,线程管理在用户空间完成的方式,是线程模型中的多对一模型(n:1);因为此种模型的线程管理是在用户空间进行的,因此效率比较高(毕竟管理上更为灵活),但是当内核空间的进程被阻塞的时候,所有用户空间的线程都会被阻塞。
| 29.666667 | 126 | 0.766854 | yue_Hant | 0.803189 |
fa2454f1987b58074be81539c8bdb95a3bcb7e52 | 102 | md | Markdown | README.md | a1540770111/myshiro | 9747c3349f5098b18889128076c37931571bbf1a | [
"Apache-2.0"
] | null | null | null | README.md | a1540770111/myshiro | 9747c3349f5098b18889128076c37931571bbf1a | [
"Apache-2.0"
] | 2 | 2020-09-15T00:22:26.000Z | 2020-09-15T00:22:26.000Z | README.md | dcy421/myshiro | 17af4147d95a9b69720cdd2a1e4101f2e232a8a7 | [
"Apache-2.0"
] | null | null | null | # myshiro
本 项目使用的技术
数据库使用的mysql
springMVC ,spring ,Mybatis,druid数据库连接池 ,shiro控制权限,
activity工作流 以后扩展
| 14.571429 | 50 | 0.813725 | kor_Hang | 0.325382 |
fa247df1b2b3b9ecd92a468be7431acf5ee09974 | 685 | md | Markdown | README.md | ozipoetra/trojan-go-quickstart | 6c77a3b6318fddfbb05bc7955363c7a6c84f25d7 | [
"MIT"
] | 27 | 2020-06-11T09:33:32.000Z | 2022-02-22T19:25:44.000Z | README.md | ozipoetra/trojan-go-quickstart | 6c77a3b6318fddfbb05bc7955363c7a6c84f25d7 | [
"MIT"
] | 1 | 2020-06-30T07:10:52.000Z | 2020-06-30T15:50:21.000Z | README.md | ozipoetra/trojan-go-quickstart | 6c77a3b6318fddfbb05bc7955363c7a6c84f25d7 | [
"MIT"
] | 19 | 2020-06-22T15:14:25.000Z | 2022-02-22T19:25:57.000Z | # trojan-go-quickstart
A simple installation script for trojan-go server.
This script will help you install the trojan-go binary to `/usr/bin`, a template for server configuration to `/etc/trojan-go`, install the geoip list to `/usr/share/trojan-go`, and (if applicable) a systemd service to `/etc/systemd/system`. It only works on `linux-amd64` machines.
## Usage
- via `curl`
```
sudo bash -c "$(curl -fsSL https://raw.githubusercontent.com/DongfeiSay/trojan-go-quickstart/master/trojan-go-quickstart.sh)"
```
- via `wget`
```
sudo bash -c "$(wget -O- https://raw.githubusercontent.com/DongfeiSay/trojan-go-quickstart/master/trojan-go-quickstart.sh)"
```
| 40.294118 | 281 | 0.710949 | eng_Latn | 0.53312 |
fa24bce6da5702a422dd0cb8559cffaceb03ae87 | 1,517 | md | Markdown | OPSDOC.md | sonatype-nexus-community/nexus-iq-server-installer | 68920176b41f9158b95d91e682e71d50ea40499b | [
"Apache-2.0"
] | 6 | 2019-12-31T09:55:51.000Z | 2021-12-21T21:45:45.000Z | OPSDOC.md | sonatype-nexus-community/nexus-iq-server-installer | 68920176b41f9158b95d91e682e71d50ea40499b | [
"Apache-2.0"
] | 4 | 2020-07-13T21:10:37.000Z | 2020-08-20T20:06:51.000Z | OPSDOC.md | sonatype-nexus-community/nexus-iq-server-installer | 68920176b41f9158b95d91e682e71d50ea40499b | [
"Apache-2.0"
] | 2 | 2021-12-04T01:09:01.000Z | 2022-01-16T20:07:14.000Z | # Nexus IQ Server Installer
The purpose of this repository is to create Deb and Rpm packages of Nexus IQ Server.
## Where?
| What? | Location |
| --------------- | ------------------------------------------------------------ |
| Source Code | https://github.com/sonatype-nexus-community/nexus-iq-server-installer |
| CI | https://circleci.com/gh/sonatype-nexus-community/nexus-iq-server-installer |
| Production Apt | https://repo.sonatype.com/repository/community-apt-hosted/ |
| Production Yum | https://repo.sonatype.com/repository/community-yum-hosted/ |
## How?
The jobs are run via CircleCI. Below will be a sample overview of the build process. The trigger to run the job in CircleCI is done via scheduled trigger. This trigger runs and checks the latest version from download.sonatype.com. This is run daily on the master branch only.
### Workflows
---
### manual_deploy
1. runs the build job
2. waits for approval after the build to continue
3. continues with deployment
### check_for_new_releases
This is essentially the daily trigger.
### Jobs
---
### build
1. scm checkout
2. setup docker environment
3. executes a makefile to build the packages (deb and rpm)
### build_and_deploy
This runs the same steps as *build* with the addition of deploying the packages to both the development and production repositories as listed above.
1. deploys rpm to production
2. deploys deb to production
| 24.868852 | 275 | 0.668425 | eng_Latn | 0.977069 |
fa2590e60590d83d206fb5c51ee3915f5abdef6a | 12,489 | md | Markdown | articles/azure-functions/functions-bindings-external-file.md | watahani/azure-docs.ja-jp | 97274dd3b2fdbc2aed07f50888ac999c4f121abe | [
"BSD-Source-Code"
] | null | null | null | articles/azure-functions/functions-bindings-external-file.md | watahani/azure-docs.ja-jp | 97274dd3b2fdbc2aed07f50888ac999c4f121abe | [
"BSD-Source-Code"
] | null | null | null | articles/azure-functions/functions-bindings-external-file.md | watahani/azure-docs.ja-jp | 97274dd3b2fdbc2aed07f50888ac999c4f121abe | [
"BSD-Source-Code"
] | null | null | null | ---
title: Azure Functions の外部ファイル バインディング (試験段階)
description: Azure Functions の外部ファイル バインディングを使用する
services: functions
author: craigshoemaker
manager: jeconnoc
ms.assetid: ''
ms.service: azure-functions
ms.devlang: multiple
ms.topic: conceptual
ms.date: 11/27/2017
ms.author: cshoe
ms.openlocfilehash: 765eab8dfc1163c4d9e0337a1af840278ae1a82c
ms.sourcegitcommit: 71ee622bdba6e24db4d7ce92107b1ef1a4fa2600
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 12/17/2018
ms.locfileid: "53546268"
---
# <a name="azure-functions-external-file-bindings-experimental"></a>Azure Functions の外部ファイル バインディング (試験段階)
この記事では、Azure Functions でさまざまな SaaS プロバイダー (Dropbox、Google ドライブなど) のファイルを操作する方法について説明します。 Azure Functions は、外部ファイルのトリガー、入力、および出力のバインディングをサポートしています。 これらのバインディングでは、SaaS プロバイダーへの API 接続を作成するか、または Function App のリソース グループにある既存の API 接続を使用します。
> [!IMPORTANT]
> 外部ファイル バインディングは試験段階であり、一般公開 (GA) されない可能性があります。 外部ファイル バインディングは Azure Functions 1.x にのみ含まれており、Azure Functions 2.x に追加される予定はありません。 SaaS プロバイダーのデータにアクセスする必要があるシナリオでは、[関数を呼び出すロジック アプリケーション](functions-twitter-email.md)の使用を検討してください。 [Logic Apps ファイル システム コネクタ](../logic-apps/logic-apps-using-file-connector.md)に関するページを参照してください。
[!INCLUDE [intro](../../includes/functions-bindings-intro.md)]
## <a name="available-file-connections"></a>使用できるファイル接続
|コネクタ|トリガー|入力|出力|
|:-----|:---:|:---:|:---:|
|[Box](https://www.box.com)|○|○|○
|[Dropbox](https://www.dropbox.com)|○|○|○
|[FTP](https://docs.microsoft.com/azure/app-service/deploy-ftp)|○|○|○
|[OneDrive](https://onedrive.live.com)|○|○|○
|[OneDrive for Business](https://onedrive.live.com/about/business/)|○|○|○
|[SFTP](https://docs.microsoft.com/azure/connectors/connectors-create-api-sftp)|○|○|○
|[Google ドライブ](https://www.google.com/drive/)||○|○|
> [!NOTE]
> 外部ファイル接続を [Azure Logic Apps](https://docs.microsoft.com/azure/connectors/apis-list) で使用することもできます。
## <a name="trigger"></a>トリガー
外部ファイル トリガーを使用すると、リモート フォルダーを監視して、変更が検出されたときに関数コードを実行できます。
## <a name="trigger---example"></a>トリガー - 例
言語固有の例をご覧ください。
* [C# スクリプト](#trigger---c-script-example)
* [JavaScript](#trigger---javascript-example)
### <a name="trigger---c-script-example"></a>トリガー - C# スクリプトの例
次の例は、*function.json* ファイルの外部ファイル トリガー バインドと、そのバインドが使用される [C# スクリプト関数](functions-reference-csharp.md)を示しています。 この関数は、監視対象のフォルダーに追加される各ファイルの内容を記録します。
*function.json* ファイルのバインディング データを次に示します。
```json
{
"disabled": false,
"bindings": [
{
"name": "myFile",
"type": "apiHubFileTrigger",
"direction": "in",
"path": "samples-workitems",
"connection": "<name of external file connection>"
}
]
}
```
C# スクリプト コードを次に示します。
```cs
public static void Run(string myFile, TraceWriter log)
{
log.Info($"C# File trigger function processed: {myFile}");
}
```
### <a name="trigger---javascript-example"></a>トリガー - JavaScript の例
次の例は、*function.json* ファイルの外部ファイル トリガー バインドと、そのバインドが使用される [JavaScript スクリプト関数](functions-reference-node.md)を示しています。 この関数は、監視対象のフォルダーに追加される各ファイルの内容を記録します。
*function.json* ファイルのバインディング データを次に示します。
```json
{
"disabled": false,
"bindings": [
{
"name": "myFile",
"type": "apiHubFileTrigger",
"direction": "in",
"path": "samples-workitems",
"connection": "<name of external file connection>"
}
]
}
```
JavaScript コードを次に示します。
```javascript
module.exports = function(context) {
context.log('Node.js File trigger function processed', context.bindings.myFile);
context.done();
};
```
## <a name="trigger---configuration"></a>トリガー - 構成
次の表は、*function.json* ファイルで設定したバインド構成のプロパティを説明しています。
|function.json のプロパティ | 説明|
|---------|---------|----------------------|
|**type** | `apiHubFileTrigger` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。|
|**direction** | `in` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。 |
|**name** | 関数コード内のイベント項目を表す変数の名前。 |
|**connection**| 接続文字列が格納されるアプリ設定を指定します。 このアプリ設定は、Azure Portal の [統合] の UI で接続を追加すると自動的に作成されます。|
|**path** | 監視するフォルダー。必要に応じて名前パターンも指定できます。|
### <a name="name-patterns"></a>名前のパターン
ファイル名のパターンは、`path` プロパティで指定できます。 参照されるフォルダーが SaaS プロバイダーに存在している必要があります。
次に例を示します。
```json
"path": "input/original-{name}",
```
このパスは、*input* フォルダーの *original-File1.txt* という名前のファイルを探し、関数コード内の `name` 変数の値は `File1.txt` になります。
別の例:
```json
"path": "input/{filename}.{fileextension}",
```
このパスは、*original-File1.txt* という名前のファイルを探し、関数コード内の `filename` 変数および `fileextension` 変数の値は *original-File1* と *txt* になります。
ファイルの種類を制限するには、ファイル拡張子に固定値を使用します。 例:
```json
"path": "samples/{name}.png",
```
この場合、*samples* フォルダー内の *.png* ファイルのみが関数をトリガーします。
中かっこは、名前のパターン内の特殊文字です。 名前に中かっこがあるファイル名を指定する場合は、中かっこを二重にします。
例:
```json
"path": "images/{{20140101}}-{name}",
```
このパスは、*images* フォルダーの *{20140101}-soundfile.mp3* という名前のファイルを探し、関数コード内の `name` 変数の値は *soundfile.mp3* になります。
## <a name="trigger---usage"></a>トリガー - 使用方法
C# 関数の場合、入力ファイル データにバインドするには、関数のシグネチャで `<T> <name>` などの名前付きパラメーターを使用します。
`T` はデータの逆シリアル化先のデータ型です。`paramName` は[トリガー JSON](#trigger) で指定した名前です。 Node.js 関数の場合、`context.bindings.<name>` を使用して入力ファイル データにアクセスします。
ファイルを、次のいずれかの型に逆シリアル化できます。
* 任意の [Object](https://msdn.microsoft.com/library/system.object.aspx) - JSON でシリアル化されたファイル データに有効です。
カスタム入力型を宣言した場合 (例: `FooType`)、Azure Functions は、指定した型に JSON データを逆シリアル化しようとします。
* 文字列 - テキスト ファイル データに有効です。
また、C# 関数では、次の型のどれにでもバインドすることができ、Functions ランタイムはその型を使用してファイル データを逆シリアル化しようとします。
* `string`
* `byte[]`
* `Stream`
* `StreamReader`
* `TextReader`
<!--- ## Trigger - file receipts
The Azure Functions runtime makes sure that no external file trigger function gets called more than once for the same new or updated file.
It does so by maintaining *file receipts* to determine if a given file version has been processed.
File receipts are stored in a folder named *azure-webjobs-hosts* in the Azure storage account for your function app
(specified by the `AzureWebJobsStorage` app setting). A file receipt has the following information:
* The triggered function ("*<function app name>*.Functions.*<function name>*", for example: "functionsf74b96f7.Functions.CopyFile")
* The folder name
* The file type ("BlockFile" or "PageFile")
* The file name
* The ETag (a file version identifier, for example: "0x8D1DC6E70A277EF")
To force reprocessing of a file, delete the file receipt for that file from the *azure-webjobs-hosts* folder manually.
--->
## <a name="trigger---poison-files"></a>トリガー - 有害ファイル
外部ファイル トリガー関数が失敗すると、Azure Functions はその関数を特定のファイルに対して既定で最大 5 回再試行します (これは最初の試行を含む数字です)。
試行が 5 回とも失敗した場合、Functions は *webjobs-apihubtrigger-poison* という名前の Storage キューにメッセージを追加します。 有害なファイルのキュー メッセージは次のプロパティを持つ JSON オブジェクトです。
* FunctionId (形式: *<Function App 名>*.Functions.*<関数名>*)
* FileType
* FolderName
* FileName
* ETag (ファイルのバージョン識別子。たとえば、"0x8D1DC6E70A277EF")
## <a name="input"></a>入力
Azure 外部ファイル入力バインディングにより、外部フォルダー内のファイルを関数で使用できるようになります。
## <a name="input---example"></a>入力 - 例
言語固有の例をご覧ください。
* [C# スクリプト](#input---c-script-example)
* [JavaScript](#input---javascript-example)
### <a name="input---c-script-example"></a>入力 - C# スクリプトの例
次の例は、*function.json* ファイルの外部ファイルの入力および出力バインディングと、そのバインディングを使用する [C# スクリプト関数](functions-reference-csharp.md)を示しています。 この関数は、入力ファイルを出力ファイルにコピーします。
*function.json* ファイルのバインディング データを次に示します。
```json
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputFile",
"type": "apiHubFile",
"path": "samples-workitems/{queueTrigger}",
"connection": "<name of external file connection>",
"direction": "in"
},
{
"name": "myOutputFile",
"type": "apiHubFile",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "<name of external file connection>",
"direction": "out"
}
],
"disabled": false
}
```
C# スクリプト コードを次に示します。
```cs
public static void Run(string myQueueItem, string myInputFile, out string myOutputFile, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
myOutputFile = myInputFile;
}
```
### <a name="input---javascript-example"></a>入力 - JavaScript の例
次の例は、*function.json* ファイルの外部ファイルの入力および出力バインディングと、そのバインディングを使用する [JavaScript 関数](functions-reference-node.md)を示しています。 この関数は、入力ファイルを出力ファイルにコピーします。
*function.json* ファイルのバインディング データを次に示します。
```json
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputFile",
"type": "apiHubFile",
"path": "samples-workitems/{queueTrigger}",
"connection": "<name of external file connection>",
"direction": "in"
},
{
"name": "myOutputFile",
"type": "apiHubFile",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "<name of external file connection>",
"direction": "out"
}
],
"disabled": false
}
```
JavaScript コードを次に示します。
```javascript
module.exports = function(context) {
context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
context.bindings.myOutputFile = context.bindings.myInputFile;
context.done();
};
```
## <a name="input---configuration"></a>入力 - 構成
次の表は、*function.json* ファイルで設定したバインド構成のプロパティを説明しています。
|function.json のプロパティ | 説明|
|---------|---------|----------------------|
|**type** | `apiHubFile` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。|
|**direction** | `in` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。 |
|**name** | 関数コード内のイベント項目を表す変数の名前。 |
|**connection**| 接続文字列が格納されるアプリ設定を指定します。 このアプリ設定は、Azure Portal の [統合] の UI で接続を追加すると自動的に作成されます。|
|**path** | フォルダー名とファイル名を含める必要があります。 たとえば、関数内に[キュー トリガー](functions-bindings-storage-queue.md)がある場合は、`"path": "samples-workitems/{queueTrigger}"` を使用して、トリガー メッセージで指定されたファイル名と同じ名前を持つ `samples-workitems` フォルダー内のファイルをポイントできます。
## <a name="input---usage"></a>入力 - 使用方法
C# 関数の場合、入力ファイル データにバインドするには、関数のシグネチャで `<T> <name>` などの名前付きパラメーターを使用します。 `T` はデータの逆シリアル化先のデータ型です。`name` は入力バインドで指定した名前です。 Node.js 関数の場合、`context.bindings.<name>` を使用して入力ファイル データにアクセスします。
ファイルを、次のいずれかの型に逆シリアル化できます。
* 任意の [Object](https://msdn.microsoft.com/library/system.object.aspx) - JSON でシリアル化されたファイル データに有効です。
カスタム入力型を宣言した場合 (例: `InputType`)、Azure Functions は、指定した型に JSON データを逆シリアル化しようとします。
* 文字列 - テキスト ファイル データに有効です。
また、C# 関数では、次の型のどれにでもバインドすることができ、Functions ランタイムはその型を使用してファイル データを逆シリアル化しようとします。
* `string`
* `byte[]`
* `Stream`
* `StreamReader`
* `TextReader`
## <a name="output"></a>出力
Azure 外部ファイル出力バインディングにより、関数で外部フォルダーにファイルを書き込むことが可能になります。
## <a name="output---example"></a>出力 - 例
[入力バインドの例](#input---example)のセクションを参照してください。
## <a name="output---configuration"></a>出力 - 構成
次の表は、*function.json* ファイルで設定したバインド構成のプロパティを説明しています。
|function.json のプロパティ | 説明|
|---------|---------|----------------------|
|**type** | `apiHubFile` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。|
|**direction** | `out` に設定する必要があります。 このプロパティは、Azure Portal でトリガーを作成するときに自動で設定されます。 |
|**name** | 関数コード内のイベント項目を表す変数の名前。 |
|**connection**| 接続文字列が格納されるアプリ設定を指定します。 このアプリ設定は、Azure Portal の [統合] の UI で接続を追加すると自動的に作成されます。|
|**path** | フォルダー名とファイル名を含める必要があります。 たとえば、関数内に[キュー トリガー](functions-bindings-storage-queue.md)がある場合は、`"path": "samples-workitems/{queueTrigger}"` を使用して、トリガー メッセージで指定されたファイル名と同じ名前を持つ `samples-workitems` フォルダー内のファイルをポイントできます。
## <a name="output---usage"></a>出力 - 使用方法
C# 関数の場合、出力ファイルにバインドするには、関数のシグネチャで `out <T> <name>` などの名前付き `out` パラメーターを使用します。`T` はデータのシリアル化先のデータ型です。`name` は出力バインドで指定した名前です。 Node.js 関数の場合、`context.bindings.<name>` を使用して出力ファイルにアクセスします。
出力ファイルには、次のいずれかの型で書き込むことができます。
* 任意の [Object](https://msdn.microsoft.com/library/system.object.aspx) - JSON でのシリアル化に有効です。
カスタム出力型を宣言した場合 (例: `out OutputType paramName`)、Azure Functions は、オブジェクトを JSON にシリアル化しようとします。 関数の終了時に出力パラメーターが null の場合、Functions ランタイムはファイルを null オブジェクトとして作成します。
* 文字列 - (`out string paramName`) テキスト ファイル データに有効です。 Functions ランタイムは、関数の終了時に文字列パラメーターが null でない場合にのみファイルを作成します。
C# 関数の場合は、次の型のいずれかに出力することもできます。
* `TextWriter`
* `Stream`
* `CloudFileStream`
* `ICloudFile`
* `CloudBlockFile`
* `CloudPageFile`
## <a name="next-steps"></a>次の手順
> [!div class="nextstepaction"]
> [Azure Functions のトリガーとバインドの詳細情報](functions-triggers-bindings.md)
| 32.608355 | 323 | 0.708544 | yue_Hant | 0.642957 |
fa25ed8feeb0ae0ad62f21d9d1c68995e9a97f65 | 295 | md | Markdown | README.md | henzGit/coin-api | a908ae2f044009e081c290da5b69146e7efbf670 | [
"MIT"
] | null | null | null | README.md | henzGit/coin-api | a908ae2f044009e081c290da5b69146e7efbf670 | [
"MIT"
] | null | null | null | README.md | henzGit/coin-api | a908ae2f044009e081c290da5b69146e7efbf670 | [
"MIT"
] | null | null | null | # coin-api
Experimental API to make bids to buy and sell coins
## Specs:
+ Written in Go 1.11
## Install dependencies
dep ensure
## How to run:
go run src/*
## How to test:
curl -X POST -d @data.json -H "Content-Type: application/json" http://localhost:3000/bid-buy
| 19.666667 | 97 | 0.640678 | eng_Latn | 0.592374 |
fa268c5aa3981b499c3ba1c09ef21fbb75bb8913 | 615 | md | Markdown | README.md | bibary/pygifconverter_first | bcd8bb89dd707f7f0a1262f8a0bfbf3b64f6e6e6 | [
"MIT-0"
] | null | null | null | README.md | bibary/pygifconverter_first | bcd8bb89dd707f7f0a1262f8a0bfbf3b64f6e6e6 | [
"MIT-0"
] | null | null | null | README.md | bibary/pygifconverter_first | bcd8bb89dd707f7f0a1262f8a0bfbf3b64f6e6e6 | [
"MIT-0"
] | null | null | null | # pygifconvt
## Table of Contents
* [Installation](#installation)
* [Quick start](#quick-start)
* [Features](#features)
## Installation
Download using pip via pypi.
```bash
$ pip install 'package' --upgrade
or
$ pip install git+'repository'
```
(Mac/homebrew users may need to use ``pip3``)
## Quick start
```python
>>> from pygifconvt.converter import GifConverter
>>> c = GifConverter("your original images path", 'your gif output path', (320,240))
>>> c.convert_gif()
```
## Features
* Python library to convert single order multiple frame gif images
* OpenCV does not support gif images. | 21.206897 | 85 | 0.689431 | eng_Latn | 0.823871 |
fa26cc7ac74e28637eec4b5de3c012080aa5569b | 2,045 | md | Markdown | CONTRIBUTING.md | aha-001/SeedLang | 17d0de55599c1779d2bb60dd807f6b10ebf32308 | [
"Apache-2.0"
] | 6 | 2022-02-09T11:25:29.000Z | 2022-02-13T19:54:13.000Z | CONTRIBUTING.md | aha-001/SeedLang | 17d0de55599c1779d2bb60dd807f6b10ebf32308 | [
"Apache-2.0"
] | 33 | 2021-06-13T06:30:11.000Z | 2021-12-19T12:59:21.000Z | CONTRIBUTING.md | SeedV/SeedLang | 53e23301cc8d21e2752e94a6b6c6697e7976fcad | [
"Apache-2.0"
] | null | null | null | # Contributing to SeedLang
Thanks for taking the time to contribute!
## Design decisions
There are lots of trade-offs to made when building a new programming
environment. We will document every significant design decision in the `design`
dir. Please create a new issue labeled with `design` if you find anything
undocumented or propose an update for an existing design.
## Submitting changes
For the main branch, all commits must be submitted via a pull request with at
least one approving reviews from the core team.
For the developers who have write-access to the repository, please consider the
following branch naming conventions before submitting a pull request:
* Release branches: `release_<version>`
* Personal working branches:
* `wip_<your-github-id>`, or
* `wip_<your-github-id>_<task-desc>`, or
* `wip_<your-github-id>_<sequence_no>`, or
* Experimental branches:
* `exp_<your-github-id>`, or
* `exp_<your-github-id>_<sequence_no>`
* Temporary bugfix branch: `bugfix_<issue-id>`
* Temporary hotfix branch: `hotfix_<issue-id>`
* Temporary feature branch: `feature_<issue-id>`
Please follow [How to Write a Git Commit
Message](https://chris.beams.io/posts/git-commit/) when writing your commit
messages whenever possible.
## Coding conventions
We follow the [Google Style Guides](https://google.github.io/styleguide/) for
each programming language used in the project.
For documentations, please follow the [Google documentation
guide](https://google.github.io/styleguide/docguide/) whenever possible.
Directory names and filenames in the repo should be all lowercase, with
underscores to separate words, unless the coding style of the source language
prefers or requires other styles. For example, directory names and filenames of
C# code should be `PascalCase`.
### Exceptions
* For Markdown docs, the default
[markdownlint](https://github.com/markdownlint/markdownlint) rule set
overrides the [Google Markdown style
guide](https://google.github.io/styleguide/docguide/style.html) if the two
conflict.
| 36.517857 | 79 | 0.776039 | eng_Latn | 0.99403 |
fa27fe072edaa113e1a0ab84600b3a7896774450 | 1,198 | md | Markdown | wiki/Cineplex.md | nishad/tsukuba | e2a662d8952a185b616844f63e6c1610b59e098e | [
"CC-BY-3.0"
] | null | null | null | wiki/Cineplex.md | nishad/tsukuba | e2a662d8952a185b616844f63e6c1610b59e098e | [
"CC-BY-3.0"
] | null | null | null | wiki/Cineplex.md | nishad/tsukuba | e2a662d8952a185b616844f63e6c1610b59e098e | [
"CC-BY-3.0"
] | null | null | null | ---
title: Cineplex
permalink: wiki/Cineplex/
layout: wiki
---
Cineplex (in [Tsukuba You World](/wiki/Tsukuba_You_World "wikilink") on [Route
354](/wiki/Noda_Sen "wikilink") at the southern part of the city) has various
deals so you should never have to pay full price (1800 yen) if you play
your cards right.
- Late shows (starting after 9pm) are 1200 yen.
- On the first day of every month, you can see shows for 1000 yen, and
if you are a woman, you can see 1000 yen shows every Wednesday too.
- University students and high school students can get in for 1500 yen
as long as they show their student card.
- JHS and elementary school kids are always 1000 yen.
- Seniors (above 60 years of age) get in for 1000 yen.
- Children below 3 get the "amazing" price of 900 yen.
If you have a discount card, I believe you (and a number of your
friends, if you all pay together) can get in for 1500 yen. Note that
almost all discounts are invalid for sneak preview shows (usually shown
on a Saturday, one week before the show opens) and premiers.
Links
-----
- <http://www.cineplex.co.jp> Cineplex
- <http://www.cineplex.co.jp/tsukuba/time_table.html> Cineplex
Timetable
| 36.30303 | 78 | 0.733723 | eng_Latn | 0.997325 |
fa287823991da2b068e8d479f02b118040d0e09b | 181 | markdown | Markdown | README.markdown | rehno-lindeque/adt-svg.js | a0f2f226aaa1ae12c19079c1f5c9590447c3c6bb | [
"CC0-1.0"
] | 1 | 2015-06-15T01:41:34.000Z | 2015-06-15T01:41:34.000Z | README.markdown | rehno-lindeque/adt-svg.js | a0f2f226aaa1ae12c19079c1f5c9590447c3c6bb | [
"CC0-1.0"
] | null | null | null | README.markdown | rehno-lindeque/adt-svg.js | a0f2f226aaa1ae12c19079c1f5c9590447c3c6bb | [
"CC0-1.0"
] | null | null | null | # adt-svg.js
Treat SVG like any old [algebraic data type (ADT)](http://en.wikipedia.org/wiki/Algebraic_data_type)! Use it with [adt.js](https://github.com/rehno-lindeque/adt.js).
| 36.2 | 165 | 0.734807 | kor_Hang | 0.4506 |
fa290d042556f73a1f2cb9bf8e137032eb775787 | 853 | md | Markdown | _wiki/BOZRAH.md | reformed-beginner/expository-thoughts | 21de448ed2f35c070c9822eef7b47572cbdf424c | [
"MIT"
] | 1 | 2020-07-21T05:27:38.000Z | 2020-07-21T05:27:38.000Z | _wiki/BOZRAH.md | reformed-beginner/expository-thoughts | 21de448ed2f35c070c9822eef7b47572cbdf424c | [
"MIT"
] | null | null | null | _wiki/BOZRAH.md | reformed-beginner/expository-thoughts | 21de448ed2f35c070c9822eef7b47572cbdf424c | [
"MIT"
] | null | null | null | ---
layout: wiki
title: 波斯拉(BOZRAH)
categories: NewBibleDictionary
description: 圣经新词典 - 波斯拉(BOZRAH)
keywords: 波斯拉, BOZRAH
comments: false
---
## 波斯拉(BOZRAH)
1. 以东地的一座城,早期的王是约巴(创卅六33;代上一44)。阿摩司曾经预言波斯拉后来的覆亡(摩一12),并且将这城的灭亡理解为一个象征,象征以东终被击败,以及神的报应临到所有敌挡祂的人(赛卅四6,六十三1)。波斯拉通常被鉴定为今日的布色拉(Buseirah),一座设防的城市,在哈马以德河谷(Wadi Hamayideh)源头的一个危岩顶上,占地十九英亩。这城在彼特拉(Petra)以北约六十公里,和死海东南偏南约四十公里,控制着从以拉他(Elath)北上的*王的大道,因此能阻碍以色列人通过(民廿17)。1971至1976年在布色拉的发掘,发现了主前八世纪及以后三个主要的、有人居住过的地层,不过还没有发现为期更早的、有人住过的地层(C. Bennett, Levant 5, 1973,页1-11; 6, 1974,页1-24)。
2. 摩押的一座城市(耶四十八24;七十士译本作波撒 [Bosor]),也许就是比撒,于主前约830年被*米沙重建的一座城,可能是在米底巴东北的暗阿玛迪城(Umm al-`Amad)。这城用作利未人的逃城。
3. 浩兰(Hauran)东南部的一座城镇,在王的大道的北端、大马色以南约120公里,曾被犹大马加比占领(主前165-160;1 Macc. 5:26-28; Jos., Ant. 12. 336)。波斯拉(现今的布斯拉艾斯基闪 [Busruna eski-Sham],大概也就是*亚玛拿文稿提到的,主前十四世纪的布斯路拿 [Busruna] 〔波斯拉〕),在新约时代,成了罗马帝国境内亚拉伯最北部的省会。
D.J.W.
| 34.12 | 369 | 0.787808 | yue_Hant | 0.518834 |
fa29b1f9bd480e9336e87b54ca1ac0e789e050c7 | 122 | md | Markdown | README.md | s-gisbert/planets | 107ad91ff7dafff77c7206e453b77d7dcee5951b | [
"MIT"
] | null | null | null | README.md | s-gisbert/planets | 107ad91ff7dafff77c7206e453b77d7dcee5951b | [
"MIT"
] | null | null | null | README.md | s-gisbert/planets | 107ad91ff7dafff77c7206e453b77d7dcee5951b | [
"MIT"
] | null | null | null | # planets
Project to colonize Mars
Añado algo
# Te lo voy a editar xq puedo
Como me aburro voy a seguir modificándolo
| 13.555556 | 42 | 0.762295 | spa_Latn | 0.895057 |
fa2ad5dd9778ed8f21221683c0d636285a68e2f4 | 2,041 | md | Markdown | KoaStructure/node_modules/koa-session/Readme.md | lynnchurch/NodeJSLearning | 175579d83c7491b250de105128254a406601741b | [
"Apache-2.0"
] | null | null | null | KoaStructure/node_modules/koa-session/Readme.md | lynnchurch/NodeJSLearning | 175579d83c7491b250de105128254a406601741b | [
"Apache-2.0"
] | null | null | null | KoaStructure/node_modules/koa-session/Readme.md | lynnchurch/NodeJSLearning | 175579d83c7491b250de105128254a406601741b | [
"Apache-2.0"
] | null | null | null | # koa-session
Simple cookie-based session middleware for Koa.
## Installation
```js
$ npm install koa-session
```
## Example
View counter example:
```js
var session = require('koa-session');
var koa = require('koa');
var app = koa();
app.keys = ['some secret hurr'];
app.use(session(app));
app.use(function *(){
var n = this.session.views || 0;
this.session.views = ++n;
this.body = n + ' views';
})
app.listen(3000);
console.log('listening on port 3000');
```
## Semantics
This module provides "guest" sessions, meaning any visitor will have a session,
authenticated or not. If a session is _new_ a Set-Cookie will be produced regardless
of populating the session.
## API
### Options
The cookie name is controlled by the `key` option, which defaults
to "koa:sess". All other options are passed to `ctx.cookies.get()` and
`ctx.cookies.set()` allowing you to control security, domain, path,
and signing among other settings.
### Session#isNew
Returns __true__ if the session is new.
### Session#maxAge
Get cookie's maxAge.
### Session#maxAge=
Set cookie's maxAge.
### Destroying a session
To destroy a session simply set it to `null`:
```js
this.session = null;
```
## Session Stores
This module only supports cookie sessions. There are many other modules listed in [koa's wiki](https://github.com/koajs/koa/wiki#wiki-sessions) for sessions that use database storage. Unlike Connect 2.x's session middleware, there is no main "session" middleware that you plugin different stores - each store is a completely different module.
If you're interested in creating your own koa session store, feel free to fork/extend this repository and add additional tests. At a minimum, it __should__ pass this repositories' tests that apply. Ideally, there would be a central repository with specifications and tests for all koa sessions, which would allow interoperability and consistency between session modules. If you're interested in working on such a project, let us know!
## License
MIT
| 26.506494 | 436 | 0.729054 | eng_Latn | 0.989888 |
fa2b5ede7e88ed90b6b8e54c2e92f81a61aef81d | 119 | md | Markdown | README.md | Constaniphobia/Morse-Wakie | 3c7ba3d028368bb74913ac78f45437dd38cdb831 | [
"MIT"
] | null | null | null | README.md | Constaniphobia/Morse-Wakie | 3c7ba3d028368bb74913ac78f45437dd38cdb831 | [
"MIT"
] | null | null | null | README.md | Constaniphobia/Morse-Wakie | 3c7ba3d028368bb74913ac78f45437dd38cdb831 | [
"MIT"
] | null | null | null | # Morse-Wakie
Creating a Morse Code Alarm system that requires you to input a line of Morse code to turn off the alarm
| 39.666667 | 104 | 0.789916 | eng_Latn | 0.99794 |
fa2b8451ea67da8d20250e8e6f4fcc2f290ca992 | 4,522 | md | Markdown | Documentation/ceph-examples.md | colonwq/rook | ef336964cda20c64560f0ceda7643b723292a42f | [
"Apache-2.0"
] | null | null | null | Documentation/ceph-examples.md | colonwq/rook | ef336964cda20c64560f0ceda7643b723292a42f | [
"Apache-2.0"
] | null | null | null | Documentation/ceph-examples.md | colonwq/rook | ef336964cda20c64560f0ceda7643b723292a42f | [
"Apache-2.0"
] | null | null | null | ---
title: Examples
weight: 2050
indent: true
---
{% assign url = page.url | split: '/' %}
{% assign currentVersion = url[3] %}
{% if currentVersion != 'master' %}
{% assign branchName = currentVersion | replace: 'v', '' | prepend: 'release-' %}
{% else %}
{% assign branchName = currentVersion %}
{% endif %}
# Ceph Examples
Configuration for Rook and Ceph comes in many shapes and sizes. While we have done everything possible to make configuring storage
simple, you will need to decide what settings work for your environment. The settings on the operator and the
Rook Custom Resource Definitions (CRDs) are flexible. To get started, we have created several common configurations.
See the **[Ceph examples][example yaml files](https://github.com/rook/rook/blob/{{ branchName }}/cluster/examples/kubernetes/ceph)** folder for the yaml files.
## Common Resources
The first step to deploy Rook is to create the common resources. The configuration for these resources will be the same for most deployments
```
kubectl create -f common.yaml
```
The examples all assume the operator and all Ceph daemons will be started in the same namespace. If you want to deploy the operator in a separate namespace, see the comments throughout `common.yaml`.
## Operator
After the common resources are created, the next step is to create the Operator deployment. There are several examples provided:
- `operator.yaml`: The most common settings for production deployments
- `kubectl create -f operator.yaml`
- `operator-openshift.yaml`: Includes all of the operator settings for running a basic Rook cluster in an OpenShift environment. You will also want to review the [OpenShift Prerequisites](openshift.md) to confirm the settings.
- `oc create -f operator-openshift.yaml`
- `operator-with-csi.yaml`: Includes configuration for testing ceph-csi while the integration is still in progress. See the [CSI Drivers](ceph-csi-drivers.md) topic for more details.
- `kubectl create -f operator-with-csi.yaml`
Settings for the operator are configured through environment variables on the operator deployment. The individual settings are documented in `operator.yaml`.
## Cluster CRD
Now that your operator is running, let's create your Ceph storage cluster:
- `cluster.yaml`: Common settings for a production storage cluster. Requires at least three nodes.
- `cluster-test.yaml`: Settings for a test cluster where redundancy is not configured. Requires only a single node.
- `cluster-minimal.yaml`: Brings up a cluster with only one mon and a mgr so the Ceph dashboard can be used for the remaining cluster configuration.
See the [Cluster CRD](ceph-cluster-crd.md) topic for more details on the settings.
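For orientation only, here is a minimal sketch of what a test cluster manifest might look like. The image tag and paths below are assumptions that vary by Rook release, so treat `cluster-test.yaml` from the examples folder as the authoritative version:
```yaml
apiVersion: ceph.rook.io/v1
kind: CephCluster
metadata:
  name: rook-ceph
  namespace: rook-ceph
spec:
  cephVersion:
    # Assumed image tag; use the one shipped with your Rook release
    image: ceph/ceph:v14.2.1
  # Host path where configuration and mon data are persisted
  dataDirHostPath: /var/lib/rook
  mon:
    # A single mon is acceptable only for test clusters; use 3 in production
    count: 1
    allowMultiplePerNode: true
  storage:
    useAllNodes: true
    useAllDevices: false
    # Test-only convenience: store data in a host directory instead of raw devices
    directories:
      - path: /var/lib/rook
```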
## Storage Class
The storage class is defined with a pool which defines the level of data redundancy:
- `storageclass.yaml`: Replication of 3 for production scenarios. Requires at least three nodes.
- `storageclass-ec.yaml`: Configures erasure coding for data durability rather than replication. See the [Erasure coding](ceph-pool-crd.md#erasure-coded) documentation for more details. Requires at least three nodes.
- `storageclass-test.yaml`: Replication of 1 for test scenarios. Requires only a single node.
See the [Ceph Pool CRD](ceph-pool-crd.md) topic for more details on the settings.
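As a rough, version-dependent sketch of the pattern (the provisioner name and parameters differ between Rook releases, so check `storageclass.yaml` for your release), a replicated pool plus a storage class referencing it looks roughly like this:
```yaml
apiVersion: ceph.rook.io/v1
kind: CephBlockPool
metadata:
  name: replicapool
  namespace: rook-ceph
spec:
  replicated:
    # Three copies of the data; requires at least three nodes
    size: 3
---
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: rook-ceph-block
# Assumed flex-volume provisioner name; newer releases use the CSI driver instead
provisioner: ceph.rook.io/block
parameters:
  blockPool: replicapool
  clusterNamespace: rook-ceph
```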
## Shared File System
File storage contains multiple pools that can be configured for different scenarios:
- `filesystem.yaml`: Replication of 3 for production scenarios. Requires at least three nodes.
- `filesystem-ec.yaml`: Erasure coding for production scenarios. Requires at least three nodes.
- `filesystem-test.yaml`: Replication of 1 for test scenarios. Requires only a single node.
See the [Shared File System CRD](ceph-filesystem-crd.md) topic for more details on the settings.
## Object Storage
Object storage contains multiple pools that can be configured for different scenarios:
- `object.yaml`: Replication of 3 for production scenarios. Requires at least three nodes.
- `object-openshift.yaml`: Replication of 3 with rgw in a port range valid for OpenShift. Requires at least three nodes.
- `object-ec.yaml`: Erasure coding rather than replication for production scenarios. Requires at least three nodes.
- `object-test.yaml`: Replication of 1 for test scenarios. Requires only a single node.
See the [Object Store CRD](ceph-object-store-crd.md) topic for more details on the settings.
### Object Storage User
- `object-user.yaml`: Creates a simple object storage user and generates creds for the S3 API
| 58.727273 | 226 | 0.774215 | eng_Latn | 0.992466 |
fa2bd71fd9002a6afb8dcfc92c9a81ee35e0d94a | 1,012 | md | Markdown | content/articles/2010-07-03-momcorp.md | kremalicious/blog | 601d0277908066b70c2a8029fc6e556600928967 | [
"MIT"
] | 43 | 2018-10-07T03:27:31.000Z | 2022-01-31T17:54:33.000Z | content/articles/2010-07-03-momcorp.md | kremalicious/blog | 601d0277908066b70c2a8029fc6e556600928967 | [
"MIT"
] | 590 | 2018-09-12T22:31:12.000Z | 2022-03-31T13:24:13.000Z | content/articles/2010-07-03-momcorp.md | kremalicious/blog | 601d0277908066b70c2a8029fc6e556600928967 | [
"MIT"
] | 12 | 2018-10-07T14:50:42.000Z | 2021-09-16T22:30:00.000Z | ---
title: MomCorp Wallpaper
image: ../media/Teaser-MomCorp-Wall.png
download: ../media/momcorp_wall_by_kremalicious.zip
author: Matthias Kretschmann
date: 2010-07-03 17:12:53+00:00
tags:
- goodies
- wallpaper
- futurama
---
The Futurama episode [Attack of the Killer App](http://en.wikipedia.org/wiki/Attack_of_the_Killer_App) mocked a phone and a company you have probably all heard of. I've made some wallpapers with the MomCorp logo that is presented everywhere in this episode.
The wallpaper comes in four versions with two color variations and two text variations with Mom's favorite tagline. Here's an overview:

## Download
You can grab the full zip-package with versions for Desktop, iPad, iPhone & Android included:
<p class="content-download">
<a class="btn btn-primary icon-download" href="../media/momcorp_wall_by_kremalicious.zip">Download</a>
<a href="http://krlc.us/givecoffee" class="btn icon-heart">Donate</a>
</p>
| 34.896552 | 252 | 0.757905 | eng_Latn | 0.81571 |
fa2c4391c70855d26d1f82ba0b279ed1475e6ef8 | 236 | md | Markdown | README.md | sproutcat/easy-odata | 6dae5ade68be17a98399072e34e2e9f608e725f4 | [
"Apache-2.0"
] | null | null | null | README.md | sproutcat/easy-odata | 6dae5ade68be17a98399072e34e2e9f608e725f4 | [
"Apache-2.0"
] | null | null | null | README.md | sproutcat/easy-odata | 6dae5ade68be17a98399072e34e2e9f608e725f4 | [
"Apache-2.0"
] | null | null | null | easy-odata
======================================
依据 [odata v4.x](odata-v4.01-part2-url-conventions.pdf) 的标准,再根据实际业务,实现一套自定义的 odata 标准
后端项目
--------------------------------------
前端项目
--------------------------------------
| 13.111111 | 85 | 0.351695 | che_Cyrl | 0.09964 |
fa2c6dbae29fb2fd560b6d31b324a7f207028da0 | 1,044 | markdown | Markdown | _posts/2016-07-29-JS-Get-Operator-System.markdown | SimKing/simking.github.io | b36ba9cdb690f90ca906f3ec639225bc9d663404 | [
"MIT"
] | null | null | null | _posts/2016-07-29-JS-Get-Operator-System.markdown | SimKing/simking.github.io | b36ba9cdb690f90ca906f3ec639225bc9d663404 | [
"MIT"
] | null | null | null | _posts/2016-07-29-JS-Get-Operator-System.markdown | SimKing/simking.github.io | b36ba9cdb690f90ca906f3ec639225bc9d663404 | [
"MIT"
] | null | null | null | ---
layout: post
title: "JS获取手机操作系统"
date: 2016-07-29
author: "Sim"
catalog: false
tags:
- JavaScript
---
1. Get the phone's operating system and version
```js
// page-level flag set when an Android device is detected
var isAndroid = false;

/**
 * Get the operating system from the user agent
 *
 * @return {String} system
 */
function getOS() {
    var ua = navigator.userAgent;
    if (ua.indexOf("Windows NT 5.1") != -1) return "Windows XP";
    if (ua.indexOf("Windows NT 6.0") != -1) return "Windows Vista";
    if (ua.indexOf("Windows NT 6.1") != -1) return "Windows 7";
    if (ua.indexOf("iPhone") != -1) return "iPhone";
    if (ua.indexOf("iPad") != -1) return "iPad";
    if (ua.indexOf("Linux") != -1) {
      var index = ua.indexOf("Android");
      if (index != -1) {
        // OS name and version, e.g. "Android 4.4.2" (13 characters)
        var os = ua.slice(index, index + 13);
        // device model: the token between the last ";" and "Build"
        // (this parsing is fragile and may fail on unusual user agents)
        var index1 = ua.lastIndexOf(";");
        var index2 = ua.indexOf("Build");
        var type = ua.slice(index1 + 1, index2);
        isAndroid = true;
        return type + os;
      } else {
        return "Linux";
      }
    }
    return "Unknown operating system";
}
```
2. Get the current WeChat version with JS
```js
// match() returns null outside of WeChat, so guard before indexing
var wx = navigator.userAgent.match(/MicroMessenger\/([\d\.]+)/i);
var version = wx ? wx[1] : null; // WeChat version, or null if not inside WeChat
```
| 19.333333 | 65 | 0.584291 | yue_Hant | 0.777964 |
fa2ca4fd355ef25d4f9681de090746c513b89583 | 3,674 | md | Markdown | _posts/2019-02-06-Download-official-sony-ps3-headset-manual.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-02-06-Download-official-sony-ps3-headset-manual.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-02-06-Download-official-sony-ps3-headset-manual.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Official sony ps3 headset manual book
At our neared the western plains, but it didn't reek, looked diagonally across it at the visitor in the client's chair. " recognized too well. of these craft from those now used on the Siberian rivers, which is good in one way and bad in another. She had got her hands clean, and the other quiet philanthropies, child or adult, when the fire was finally put out: just enough charred clues to allow them an easy conclusion, "but Official sony ps3 headset manual have little time for reading! " She nodded, i. delineations both with the pencil and the pen. I can't get stuck I was attracted to an avenue of elongated lights. Apparently instinct melancholy, i, into all eventful "Jake. Official sony ps3 headset manual, perhaps to rest or "You've been drinking now," she softly official sony ps3 headset manual This action complicates everything again. Yenisej, another to return; he would be back well before the Fallows at the an exaggeration, Littleash, Aihal. Thick blood sluiced across his lower lip, and I won't bore you with them yet, and he didn't permit anyone to call him Enoch. He knew he was in sooner seen them off than I was flinging myself back into the runabout and driving up to Amanda's cabin. Are we letting silly rumors get to us?" He looked at Sirocco. Gray, barely avoiding being drowned, they will always take advantage of an opportunity Speed 300 miles per second; distance to destination. In the same part (p. FROM ST! And wizards, like one giant thumbscrew turned down that's one of their featured stories this week, the latter at 40 copecks each, and these, wedged between the boulders. "He'll never know. " She stares at me for several seconds. Named Angel. ] supernatural familiar ready to assist with some magical enterprise. Mirrors shattered: a must be a charming and civilized approach that would be proper, the practice was probably good for him, ii, and addressed to his Excellency the Lechat thought for a while as he continued to eat. "Sure," I say. The pipe comes sometimes to be used for arranging the "No Snickers! "I gave him food and water when they brought me some. there; a half-month to go, i. I consciously will my fingers to loosen. But it would be wonderful, among the Chukches in the interior of the [Illustration: SAMOYED GRAVE ON VAYGATS ISLAND! Nowhere, even if another four and one half percent are official sony ps3 headset manual, bearing the requisite fearsome scars if not the unrequited love for a He reviewed in memory his most beautiful killings. He walked in heat and cold. But we second piece in the series-an extrapolation of her appearance at age sixty-was however, but he hadn't intended to go, the date: 1965. "So, they'll help us out with plenty of The meadow waiting under the moon. You know what that means?" Straits, anyone who'd take that position just don't know his cows. An hour Official sony ps3 headset manual Comes Mr. Tetsyвit sounds more like a little lap dog or a cat. of provisions and of complete timidity. Thurber asked me if we could get within his nature to be! She intended to listen to a little classical music before brushing her teeth. Stitl, that by a mercantile porch-squatter, Rose. 509 and they were trying to make her more comfortable, takes to bounce lightly along. They don't lie and cheat, i. 
The book presented a brilliant argument that "Eating that stuff right before official sony ps3 headset manual Noah told him, ii, he had no choice but to conclude that she hadn't made up her mind whether to keep the baby or to seek out an illegal abortion without Junior's approval. | 408.222222 | 3,568 | 0.787153 | eng_Latn | 0.999899 |
fa2d1991222000dc4962f76746be9fef854da61f | 5,415 | md | Markdown | 340-lambda-calculi/terms/lc5def.md | mandober/debrief.math | c6bf21581ccb48a82a74038135bca09c1d0c2a4f | [
"Unlicense"
] | 1 | 2019-01-18T21:56:33.000Z | 2019-01-18T21:56:33.000Z | 340-lambda-calculi/terms/lc5def.md | mandober/debrief.math | c6bf21581ccb48a82a74038135bca09c1d0c2a4f | [
"Unlicense"
] | 1 | 2018-06-11T13:41:09.000Z | 2018-06-11T13:41:09.000Z | 340-lambda-calculi/terms/lc5def.md | mandober/dust-dllci | 3afc7936579dfefd7f823d774c4ac17cc6c57666 | [
"Unlicense"
] | null | null | null | # Lambda Calculus: Definition
https://en.wikipedia.org/wiki/Lambda_calculus_definition
## Anonymous unary functions
In λ-calculus, all functions are anonymous and unary.
**In mathematics**,
functions may be named and they can have different arity, but a mathematical function always returns a value.
For example, a named binary function:
$$\quad \quad \operatorname{square\_sum}: \ f(x,y) = x^{2} + y^{2}$$
that can be written in anonymous form:
$$\quad \quad (x,y) \mapsto x^2 + y^2$$
**In programming languages**,
especially in dynamic PLs, functions are less restricted. They may have any arity, including being nullary, and they're not obliged to return a value.
For example, in JS, which we'll use to approximate LC, a function definition is usually bound to an external identifier so it can be easily referenced later whenever needed:
```js
// function definition as statement
function squareSum(x, y) {
return x**2 + y**2
}
// function definition as an expression, i.e. a function expression
var sqsum = function alias(x, y) {
return x**2 + y**2
}
```
```js
let square_sum = (x, y) => x**2 + y**2
```
## Anonymous unary functions
The second requirement of LC, that functions are unary, can be met by rewriting a binary function (a function that accepts 2 inputs and returns a result) as a series of nested functions. Namely, we define an outer function that accepts only one input (the first argument) and returns another, inner function that takes one input (in fact, the second argument) and finally returns the result.
The process of converting an $$n$$-ary function (where $$n \neq 0$$) into a unary function (in fact, a series of nested unary functions) is called **currying**.
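For example, a minimal sketch in JS of the `square_sum` function from above, first uncurried and then curried:
```js
// binary (uncurried) version
const squareSum = (x, y) => x ** 2 + y ** 2;
squareSum(3, 4); // 25

// curried version: a unary function that returns another unary function
const squareSumCurried = x => y => x ** 2 + y ** 2;
squareSumCurried(3)(4); // 25

// partial application falls out for free
const nineePlusSquare = squareSumCurried(3); // y => 3 ** 2 + y ** 2
nineePlusSquare(4); // 25
```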
In PLs, these rules are more relaxed, so there can be nullary functions, and functions can return nothing (sometimes referred to as `void`).
We can consider LC as a (functional) programming language and see that it only deals with lambda functions. Lambda functions are not much different from functions found in any PL, except they are anonymous and unary.
LC doesn't have a concept of external binding, so we cannot declare a variable and bind anything to it, not even a function definition, like we can in e.g. JS. This makes LC extremely verbose. The other property is that all functions are unary; that is, they must declare exactly one formal parameter and, therefore, only accept a single argument.
- Lambda Calculus (LC) deals exclusively with functions in the form of function abstraction and function application.
- In LC, there is nothing but anonymous unary functions.
- LC is very verbose
- Other entities (e.g. booleans) may be represented by an arbitrarily selected lambda abstraction.
Lambda calculus is a prototype functional language based on function abstraction and function application, using parameter binding and substitution.
**Function abstraction** is the LC name for a *function definition*, as it is better known in the majority of mainstream programming languages.
$$\displaystyle \lambda x.x$$
The lambda symbol introduces a function abstraction, a unary anonymous function, which, immediately after the lambda symbol, declares its sole parameter by an arbitrary name. The dot separates the "head" and the "body" parts of a function; the body in this case is just $$x$$.
The equivalent expression in JS looks like this:
```js
(x => x)
```
In JS, this expression gives itself away as a *function expression* by the fat arrow, which separates declaration of formal parameters from the function's body. In LC this is the role of the dot.
In LC, **functions are anonymous** - a function abstraction (or any lambda expression) cannot be bound to a (global) identifier; there are no global identifiers, there are only local identifiers i.e. names of formal parameters.
Only functions exist in Lambda Calculus. Everything else (boolean constants, numbers, data structures, etc.) has to be defined from scratch. This process begins by defining a function abstraction (i.e. a function definition) then, somewhat arbitrarily, assigning a meaning to it.
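For instance, here is a minimal sketch (in JS, standing in for LC) of the conventional encoding of booleans as lambda abstractions:
```js
// TRUE selects the first of two arguments, FALSE selects the second;
// the meaning is assigned purely by convention
const TRUE  = a => b => a; // λa.λb.a
const FALSE = a => b => b; // λa.λb.b

// "if" then reduces to plain function application
const IF = p => t => f => p(t)(f); // λp.λt.λf.ptf

IF(TRUE)("then-branch")("else-branch");  // "then-branch"
IF(FALSE)("then-branch")("else-branch"); // "else-branch"
```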
Lambda Calculus is a higher-order language: it provides a systematic syntax for functions whose input and output values are other functions.
* Lambda calculus, LC, λ-calculus, λάμβδα λογισμός, λ, $$\lambda$$
* **Abstraction**
  - `λx.M`
  - e.g. `λx.x`, `λa.λb.ab`
- analogous to defining a function
  - with function parameters and a function body expression to be returned
- This defines a function which takes an argument x and returns a term M.
- Functions in lambda calculus have no name.
* **Application**: `MN`
- denoted by juxtaposition
- LEFT-associative, `abcd` is `((ab)c)d`
- analogous to a function call
  - This applies the term M to the argument N, binding N to M's formal parameter
- A term is a defined function.
- A term may contain variables.
- A variable may represent any term.
- A term may contain free variables and bound variables
- *Bound variable* is param to be bound to an arg; x is bound in λx.xy
- analogous to a formal parameter of a function
- *Free variable* is unbound (outside) variable; y is free in λx.xy
Lambda expressions
- **Variables**
- **Application**: $$mn$$ means apply expr m to expr n, $$m(n)$$
- **Abstractions**: $$\lambda x.e$$ represents the anonymous function that evaluates to (returns) the value $$e$$ when its formal parameter $$x$$ is bound to an argument.
| 43.669355 | 410 | 0.747922 | eng_Latn | 0.998497 |
fa2d5d0e2746f3a389c518f1c515029ec89f19ab | 28 | md | Markdown | ansible/roles/ml-analytics-service/README.md | surabhi-mahawar/sunbird-devops | 1b0f4f5aaabd3f9b5a2c16da2a1df0df436be490 | [
"MIT"
] | 51 | 2017-07-05T12:52:17.000Z | 2021-12-16T11:35:59.000Z | ansible/roles/ml-analytics-service/README.md | surabhi-mahawar/sunbird-devops | 1b0f4f5aaabd3f9b5a2c16da2a1df0df436be490 | [
"MIT"
] | 338 | 2017-09-21T10:18:19.000Z | 2022-03-31T11:26:13.000Z | ansible/roles/ml-analytics-service/README.md | surabhi-mahawar/sunbird-devops | 1b0f4f5aaabd3f9b5a2c16da2a1df0df436be490 | [
"MIT"
] | 531 | 2017-08-10T10:47:41.000Z | 2022-03-31T06:43:32.000Z | data-pipeline ansible roles
| 14 | 27 | 0.857143 | eng_Latn | 0.509324 |
fa2d6b4874ec1ee02bb9aea87461ea049828331e | 3,971 | md | Markdown | dynamics-nav-app/ui-how-import-and-export-report-layout.md | MicrosoftDocs/nav-content.fr-fr | 79ffc66fa77d10536464b7886fb7f064c17ed13d | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T18:47:32.000Z | 2021-04-21T00:13:46.000Z | dynamics-nav-app/ui-how-import-and-export-report-layout.md | MicrosoftDocs/nav-content.fr-fr | 79ffc66fa77d10536464b7886fb7f064c17ed13d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamics-nav-app/ui-how-import-and-export-report-layout.md | MicrosoftDocs/nav-content.fr-fr | 79ffc66fa77d10536464b7886fb7f064c17ed13d | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-10-14T19:33:52.000Z | 2021-11-05T10:37:53.000Z | ---
title: "Importation et exportation d'une présentation de rapport et de document"
description: "Vous pouvez importer et exporter une présentation de rapport personnalisée existante sous forme de fichier depuis ou vers un emplacement sur votre ordinateur et le réseau."
documentationcenter:
author: jswymer
ms.prod: dynamics-nav-2018
ms.topic: article
ms.devlang: na
ms.tgt_pltfrm: na
ms.workload: na
ms.search.keywords:
ms.date: 07/01/2017
ms.author: jswymer
ms.translationtype: HT
ms.sourcegitcommit: 4fefaef7380ac10836fcac404eea006f55d8556f
ms.openlocfilehash: 1aee8a288f2eddde3e4f54de2bd611a7c1a393b5
ms.contentlocale: fr-fr
ms.lasthandoff: 10/16/2017
---
# <a name="how-to-import-and-export-a-report-or-document-layout"></a>Procédure : importer et exporter une présentation de rapport ou de document
Vous pouvez importer et exporter une présentation de rapport personnalisée existante sous forme de fichier depuis ou vers un emplacement sur votre ordinateur et le réseau. Par exemple, vous pouvez exporter une présentation de rapport, puis envoyer le fichier à une autre personne pour modification. Cette personne peut ensuite apporter des modifications à la présentation et vous renvoyer le fichier pour que vous puissiez le réimporter.
> [!IMPORTANT]
> Vous ne pouvez pas importer ou exporter des présentations de rapport intégrées.
### <a name="to-export-a-report-layout-to-a-file"></a>Pour exporter une présentation de rapport vers un fichier
1. Sélectionnez l'icône , entrez **Sélection présentation état**, puis sélectionnez le lien connexe.
2. Sélectionnez la ligne pour le rapport contenant la présentation de rapport personnalisée que vous souhaitez exporter, puis sous l'onglet **Accueil**, dans le groupe **Traitement**, choisissez **Présentations personnalisées**.
3. Dans la fenêtre **Présentations état**, sélectionnez la présentation de rapport que vous souhaitez exporter vers un fichier, puis sous l'onglet **Accueil**, dans le groupe **Traitement**, choisissez **Exporter présentation**.
4. Dans la boîte de dialogue **Exporter fichier**, sélectionnez **Enregistrer**, puis enregistrez le fichier à un emplacement sur votre ordinateur ou réseau.
### <a name="to-import-a-report-layout-file"></a>Pour importer un fichier de présentation de rapport
1. Assurez-vous que le fichier approprié qui définit la présentation de rapport est disponible sur votre ordinateur ou réseau.
Un fichier de présentation de rapport Word doit avoir une extension de type .docx. Un fichier de présentation de rapport RDLC doit avoir une extension de type .rdlc ou .rdl.
2. Sélectionnez l'icône , entrez **Sélection présentation état**, puis sélectionnez le lien connexe.
3. Sélectionnez la ligne pour le rapport vers laquelle vous souhaitez importer la présentation de rapport, puis sous l'onglet **Accueil**, dans le groupe **Traitement**, choisissez **Présentations personnalisées**.
4. Dans la fenêtre **Présentations état**, sélectionnez la présentation de rapport vers laquelle vous souhaitez importer le fichier, puis sous l'onglet **Accueil**, dans le groupe **Traitement**, choisissez **Importer présentation**.
5. Dans la boîte de dialogue **Importer**, sélectionnez le document qui définit la présentation de rapport, puis choisissez **Ouvrir**.
La présentation de rapport personnalisé d'origine est remplacée par la présentation de rapport importée.
## <a name="see-also"></a>Voir aussi
[Procédure : créer et modifier une présentation de rapport personnalisée](ui-how-create-custom-report-layout.md)
[Gestion des présentations de rapport et de document](ui-manage-report-layouts.md)
[Utilisation des états](ui-work-report.md)
| 69.666667 | 439 | 0.770335 | fra_Latn | 0.981758 |
fa2e31873772bf18282f3bace65d45fecb0f6f41 | 108 | md | Markdown | doc/java8/01.3.md | Jason0104/java-tech-tutorial | 3b1b752fe309abff3592acd8ee914f8a22a63d25 | [
"MIT"
] | null | null | null | doc/java8/01.3.md | Jason0104/java-tech-tutorial | 3b1b752fe309abff3592acd8ee914f8a22a63d25 | [
"MIT"
] | null | null | null | doc/java8/01.3.md | Jason0104/java-tech-tutorial | 3b1b752fe309abff3592acd8ee914f8a22a63d25 | [
"MIT"
] | null | null | null | # 函数式接口
## links
* [目录](<README.md>)
* 上一节: [lambda表达式](<01.2.md>)
* 下一节: [方法与构造函数引用](<01.4.md>) | 13.5 | 32 | 0.5 | yue_Hant | 0.471767 |
fa2e9bdcaf7a78b6b1f8432b8017ba2aabe69275 | 4,096 | md | Markdown | articles/marketplace/includes/size-connect-generalize.md | MicrosoftDocs/azure-docs | 11382ebaec20415e85989ca516851e6991668461 | [
"CC-BY-4.0",
"MIT"
] | 7,073 | 2017-06-27T08:58:22.000Z | 2022-03-30T23:19:23.000Z | articles/marketplace/includes/size-connect-generalize.md | MicrosoftDocs/azure-docs | 11382ebaec20415e85989ca516851e6991668461 | [
"CC-BY-4.0",
"MIT"
] | 87,608 | 2017-06-26T22:11:41.000Z | 2022-03-31T23:57:29.000Z | articles/marketplace/includes/size-connect-generalize.md | MicrosoftDocs/azure-docs | 11382ebaec20415e85989ca516851e6991668461 | [
"CC-BY-4.0",
"MIT"
] | 17,093 | 2017-06-27T03:28:18.000Z | 2022-03-31T20:46:38.000Z | ---
title: include file
description: file
ms.service: marketplace
ms.subservice: partnercenter-marketplace-publisher
ms.topic: include
author: mingshen-ms
ms.author: krsh
ms.date: 04/16/2021
---
## Generalize the image
All images in Azure Marketplace must be reusable in a generic fashion. To achieve this, the operating system VHD must be generalized, an operation that removes all instance-specific identifiers and software drivers from a VM.
### For Windows
Windows OS disks are generalized with the [sysprep](/windows-hardware/manufacture/desktop/sysprep--system-preparation--overview) tool. If you later update or reconfigure the OS, you must run sysprep again.
> [!WARNING]
> After you run sysprep, turn the VM off until it's deployed because updates may run automatically. This shutdown will avoid subsequent updates from making instance-specific changes to the operating system or installed services. For more information about running sysprep, see [Generalize a Windows VM](../../virtual-machines/generalize.md#windows).
### For Linux
1. Remove the Azure Linux agent.
1. Connect to your Linux VM using an SSH client.
   2. In the SSH window, enter this command: `sudo waagent -deprovision+user`.
3. Type Y to continue (you can add the -force parameter to the previous command to avoid the confirmation step).
4. After the command completes, enter **Exit** to close the SSH client.
2. Stop the virtual machine.
1. In the Azure portal, select your resource group (RG) and de-allocate the VM.
2. Your VM is now generalized and you can create a new VM using this VM disk.
### Capture image
> [!NOTE]
> The Azure subscription containing the SIG must be under the same tenant as the publisher account in order to publish. Also, the publisher account must have at least Contributor access to the subscription containing the SIG.
Once your VM is ready, you can capture it in an Azure shared image gallery. Follow the steps below to capture it:
1. On [Azure portal](https://ms.portal.azure.com/), go to your Virtual Machine’s page.
2. Select **Capture**.
3. Under **Share image to Shared image gallery**, select **Yes, share it to a gallery as an image version**.
4. Under **Operating system state**, select **Generalized**.
5. Select a Target image gallery or **Create New**.
6. Select a Target image definition or **Create New**.
7. Provide a **Version number** for the image.
8. Select **Review + create** to review your choices.
9. Once the validation is passed, select **Create**.
## Set the right permissions
If your Partner Center account is the owner of the subscription hosting Shared Image Gallery, nothing further is needed for permissions.
If you only have read access to the subscription, use one of the following two options.
### Option one – Ask the owner to grant owner permission
Steps for the owner to grant owner permission:
1. Go to the Shared Image Gallery (SIG).
2. Select **Access control** (IAM) on the left panel.
3. Select **Add**, then **Add role assignment**.<br>
:::image type="content" source="../media/create-vm/add-role-assignment.png" alt-text="The add role assignment window is shown.":::
1. For **Role**, select **Owner**.
1. For **Assign access to**, select **User, group, or service principal**.
1. For **Select**, enter the Azure email of the person who will publish the image.
1. Select **Save**.
### Option Two – Run a command
Ask the owner to run either one of these commands (in either case, use the SubscriptionId of the subscription where you created the Shared image gallery).
```azurecli
az login
az provider register --namespace Microsoft.PartnerCenterIngestion --subscription {subscriptionId}
```
```powershell
Connect-AzAccount
Select-AzSubscription -SubscriptionId {subscriptionId}
Register-AzResourceProvider -ProviderNamespace Microsoft.PartnerCenterIngestion
```
> [!NOTE]
> You don’t need to generate SAS URIs as you can now publish a SIG Image on Partner Center. However, if you still need to refer to the SAS URI generation steps, see [How to generate a SAS URI for a VM image](../azure-vm-get-sas-uri.md).
| 47.08046 | 349 | 0.754639 | eng_Latn | 0.992108 |
fa2ed2801a6f8590cced1eebc70fd54f91a5d249 | 985 | md | Markdown | resources/content/articles/en/2020-05-04_10-00-00_semantics-based-compilation-en.md | aloopmarin007/testnets-cardano-org | 057255d9ff1dedb33dc04a85a1a0f017390b1172 | [
"MIT"
] | 59 | 2020-05-18T12:58:43.000Z | 2022-02-09T17:41:39.000Z | resources/content/articles/en/2020-05-04_10-00-00_semantics-based-compilation-en.md | iohkwebdev/testnets-cardano-org | b54985640cb9dbc491519d79c5ce9ed6fbae1ebe | [
"MIT"
] | 204 | 2020-05-16T19:36:23.000Z | 2022-03-28T04:11:41.000Z | resources/content/articles/en/2020-05-04_10-00-00_semantics-based-compilation-en.md | iohkwebdev/testnets-cardano-org | b54985640cb9dbc491519d79c5ce9ed6fbae1ebe | [
"MIT"
] | 46 | 2020-06-02T15:32:24.000Z | 2022-03-18T04:46:22.000Z | ---
title: Semantics based compilation
description: IELE about
parent: 2020-05-04_10-00-00_about
order: 5
last_updated: "2020-12-17T09:00:00+01:00"
redirects:
- from: /en/iele/about/semantics-based-compilation/
type: "301"
---
## Semantics based compilation
<!-- embed youtube/x_xm69gd3fE -->
> Compilers, as we know them today, will no longer be needed
>
> -- _Grigore Rosu_
Semantics-based compilation (SBC) is one of the most challenging components of the K framework. Its goal is to automatically generate a correct-by-construction compiler by applying the semantics of a language to a program written in that language. The result is a new semantics that is a synthesis of the original semantics of the language but specialized for that particular program. The new semantics is simpler, faster to execute, and easier to understand.
For IELE, this means that we can now write smart contracts in any programming language and have a correct-by-construction compiler to IELE.
| 44.772727 | 458 | 0.779695 | eng_Latn | 0.999204 |
fa2f97c219cb844bf3e2fe9313b7245bc7f9e560 | 1,324 | md | Markdown | skills/B01DBF7SJG/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 232 | 2016-03-05T06:24:41.000Z | 2022-03-21T19:32:55.000Z | skills/B01DBF7SJG/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 5 | 2016-03-21T02:25:06.000Z | 2020-01-03T15:01:39.000Z | skills/B01DBF7SJG/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 52 | 2016-04-02T06:08:55.000Z | 2021-12-12T23:52:13.000Z | # [India News](http://alexa.amazon.com/#skills/amzn1.echo-sdk-ams.app.60378b74-5ab5-424e-98a4-e8fe71faf8e6)
 6
To use the India News skill, try saying...
* *Alexa, ask india news*
* *news from sports*
* *okay*
The India News skill provides users with the latest India news updates from different categories such as education, science, technology, entertainment, sports, and business. You can start the skill by saying "Alexa, ask india news". You can then choose a category by saying "News from (category name)", or mid-session by saying "Alexa, news from (category name)". After selecting a category, you can start the news by saying "okay", "yes", "what's the latest news", "what the latest news is", "give me the news", "the news", or "latest news updates".
***
### Skill Details
* **Invocation Name:** india news
* **Category:** null
* **ID:** amzn1.echo-sdk-ams.app.60378b74-5ab5-424e-98a4-e8fe71faf8e6
* **ASIN:** B01DBF7SJG
* **Author:** Skyiwave Pvt. Ltd.
* **Release Date:** March 23, 2016 @ 00:41:31
* **In-App Purchasing:** No
| 50.923077 | 457 | 0.719033 | eng_Latn | 0.777134 |
fa2fe5f74ca99937692dd1b160418fe40aeb60d4 | 4,527 | md | Markdown | _posts/2019-03-30-Download-edexcel-igcse-mathematics-past-papers.md | Luanna-Lynde/28 | 1649d0fcde5c5a34b3079f46e73d5983a1bfce8c | [
"MIT"
] | null | null | null | _posts/2019-03-30-Download-edexcel-igcse-mathematics-past-papers.md | Luanna-Lynde/28 | 1649d0fcde5c5a34b3079f46e73d5983a1bfce8c | [
"MIT"
] | null | null | null | _posts/2019-03-30-Download-edexcel-igcse-mathematics-past-papers.md | Luanna-Lynde/28 | 1649d0fcde5c5a34b3079f46e73d5983a1bfce8c | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Edexcel igcse mathematics past papers book
" NOTE. " Dropped, if this was for real. ii. -Yea, high above the tower. In this case, alone in a long coach car, 'Show me edexcel igcse mathematics past papers treasure, Minin could not get further than to the northernmost crown! The Devout Woman and the Two Wicked Elders dclix that could be trapped for his purposes. The Falcon and the Birds clii under the name Jordan-'call me Jorry'-Banks! For one thing, but little in the way of a manhunt was 10, but "Gusinnaya Semlya" in index soft though charged with power, she wondered why God had been so cruel as to sunder such a family. Soldiers edexcel igcse mathematics past papers already coming round the corner and bearing down on them fast, and at Konyam Bay, and while doing so. You need the names. "Come on now? The wood floor gleamed as though polished by hand. The mechanism creaks and rasps. headlands dangerous to the navigator on the north coast of Russia, and I thought that he had gone "It's Amos!" cried Hidalga, tires squealing, I know. 21 deg. Except for the six or eight immense old trees edexcel igcse mathematics past papers among and high above the That truly floored her. and were hand-painted like the rest of their costumes. But they had colored galaxies of squares, coffee offered and served, gazing now through the sheep, i, 254 spectacularly high, "the instance, miles away from the valley, so many tiny hungry mouths competing for just two tits, "Her contract is in her mother's house, _Nowaja Semlae in south-east, having agreed to Admiral Slessor's request for a six-month reinstatement to help organize a caretaker crew of trainee Terrans and Chironians who would use the Mayflower II as a university of advanced astroengineering, and as blasphemous as the thought might be, two-fold menu, picking up the more serious tone. " They walked past the roaster tower, too much, etc. So that's what you want us to do. The form of the gold fish swimming in the ALTHOUGH POLLY wasn't edexcel igcse mathematics past papers Pollyanna, and he set out water and food for the Namer, but intense as it was, and masticating jaws, has been changed. "A big garden. It proved to be benign. Once for a moment something drew his mind away, compassionate intentions. 148 years, Noah was met by a uniformed officer who "I am a woman worthy of a prince," said the face in the water, iii, of course, little height go as far north as Port Dickson (73 deg, Jay," Bernard cautioned, focusing on the her difference, a great many empty preserve "Thank you very much," said Amos and walked on till he came to another sailor whose feet were All of me the _Fraser_ had been waiting for us at the appointed rendezvous I visited the day before that an open water channel, ii, death, A! So he bade bring them edexcel igcse mathematics past papers him, you must practice some deceit to get along in life, he doesn't want to leave the commotion and cover of the crowd at this contact vigil, yes, they say so will the Archmage be one returned from death, I'm sorry I snapped at you. reindeer, Paul made himself useful by assisting Grace with food yourself, but take these saddle-bags and divide [that which is in] them and take the fourth part [thereof], and you Wally said she was visually, the can be effected in a few moments, is far as the eye could reach only coffee, and that this came with no cost, bleak in spite of its aggressive "She's got preeclampsia, i. side, 1787. and Jack Lientery's powerful art combined to devastate Frieda. We arrived late in the haltingly, the staff of life. 
Why Cain, dressed this way, wire. The scene consisted of a beach "If It's nuts, not off in the warlord's castle or fort. " girl. want to think about what her posterior cranium might look like; happily, whenever people ask me to, almost a foot wheels, this monstrous man who beat people to death with "What numbies do you want. and in grottos and other water-filled subterranean cavities in southern Second Edition 31_s_. Such behavior as hers was unlikely to lead to self-discovery, but checked himself when an SD colonel trained an automatic on him, the place reeked more nauseatingly than the worst of can't, maybe. It's boring and it's depressing and it's stupid. " episode that had landed him here. The commercial voyages perhaps had long before Edexcel igcse mathematics past papers, a edexcel igcse mathematics past papers a grey hen was setting her clutch in the henhouse. | 503 | 4,416 | 0.785288 | eng_Latn | 0.999892 |
fa3065ab92b2f1ad30d318665b30c6a1d4fdaebf | 9,738 | md | Markdown | spring-boot-demo-multi-datasource-mybatis/README.md | chenzeweishare/spring-boot-demo-plus | a6dc1d548a0408c44e31cf90793f92a1ad5840ae | [
"MIT"
] | null | null | null | spring-boot-demo-multi-datasource-mybatis/README.md | chenzeweishare/spring-boot-demo-plus | a6dc1d548a0408c44e31cf90793f92a1ad5840ae | [
"MIT"
] | 2 | 2021-07-02T18:36:42.000Z | 2021-08-09T20:44:18.000Z | spring-boot-demo-multi-datasource-mybatis/README.md | chenzeweishare/spring-boot-demo-plus | a6dc1d548a0408c44e31cf90793f92a1ad5840ae | [
"MIT"
] | null | null | null | # spring-boot-demo-multi-datasource-mybatis
> 此 demo 主要演示了 Spring Boot 如何集成 Mybatis 的多数据源。可以自己基于AOP实现多数据源,这里基于 Mybatis-Plus 提供的一个优雅的开源的解决方案来实现。
## 准备工作
准备两个数据源,分别执行如下建表语句
```mysql
DROP TABLE IF EXISTS `multi_user`;
CREATE TABLE `multi_user`(
`id` bigint(64) NOT NULL,
`name` varchar(50) DEFAULT NULL,
`age` int(30) DEFAULT NULL,
PRIMARY KEY (`id`) USING BTREE
) ENGINE = InnoDB
AUTO_INCREMENT = 1
CHARACTER SET = utf8
COLLATE = utf8_general_ci;
```
## Import the dependencies
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-boot-demo-multi-datasource-mybatis</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>spring-boot-demo-multi-datasource-mybatis</name>
<description>Demo project for Spring Boot</description>
<parent>
<groupId>com.xkcoding</groupId>
<artifactId>spring-boot-demo</artifactId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>dynamic-datasource-spring-boot-starter</artifactId>
<version>2.5.0</version>
</dependency>
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>mybatis-plus-boot-starter</artifactId>
<version>3.0.7.1</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>
</dependencies>
<build>
<finalName>spring-boot-demo-multi-datasource-mybatis</finalName>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
```
## Prepare the entity class
`User.java`
> 1. @Data / @NoArgsConstructor / @AllArgsConstructor / @Builder are all Lombok annotations
> 2. @TableName("multi_user") is a Mybatis-Plus annotation that specifies the database table name when the entity class name and the table name do not follow the **camel-case and underscore interconversion** convention
> 3. @TableId(type = IdType.ID_WORKER) is a Mybatis-Plus annotation that specifies the primary key type; here I use the snowflake algorithm that Mybatis-Plus provides, based on Twitter's design
```java
/**
* <p>
 * User entity class
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis.model
 * @description: User entity class
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:19
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
@Data
@TableName("multi_user")
@NoArgsConstructor
@AllArgsConstructor
@Builder
public class User implements Serializable {
private static final long serialVersionUID = -1923859222295750467L;
/**
     * Primary key
*/
@TableId(type = IdType.ID_WORKER)
private Long id;
/**
     * Name
*/
private String name;
/**
     * Age
*/
private Integer age;
}
```
## Data access layer
`UserMapper.java`
> No corresponding XML file is needed; simply extending BaseMapper already provides most single-table operations.
```java
/**
* <p>
 * Data access layer
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis.mapper
 * @description: Data access layer
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:28
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
public interface UserMapper extends BaseMapper<User> {
}
```
## Data service layer
### Interface
`UserService.java`
```java
/**
* <p>
 * Data service layer
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis.service
 * @description: Data service layer
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:31
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
public interface UserService extends IService<User> {
/**
     * Add a User
*
     * @param user the user
*/
void addUser(User user);
}
```
### Implementation
`UserServiceImpl.java`
> 1. @DS: annotate a class or a method to switch data sources; a method-level @DS takes precedence over a class-level @DS
> 2. baseMapper: the mapper object, i.e. `UserMapper`, which provides the CRUD functionality
> 3. Slave by default: with `@DS(value = "slave")` on the class, the slave data source is used by default; a method only uses the master when it is annotated with `@DS(value = "master")`
```java
/**
* <p>
 * Data service layer implementation
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis.service.impl
 * @description: Data service layer implementation
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:37
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
@Service
@DS("slave")
public class UserServiceImpl extends ServiceImpl<UserMapper, User> implements UserService {
/**
     * {@code @DS("slave")} on the class means the slave is the default; writing {@code @DS("master")} on a method switches that method to the master
*
     * @param user the user
*/
@DS("master")
@Override
public void addUser(User user) {
baseMapper.insert(user);
}
}
```
## Startup class
`SpringBootDemoMultiDatasourceMybatisApplication.java`
> The startup class must use @MapperScan to scan the package that contains the mapper classes
```java
/**
* <p>
 * Launcher
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis
 * @description: Launcher
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:19
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
@SpringBootApplication
@MapperScan(basePackages = "com.xkcoding.multi.datasource.mybatis.mapper")
public class SpringBootDemoMultiDatasourceMybatisApplication {
public static void main(String[] args) {
SpringApplication.run(SpringBootDemoMultiDatasourceMybatisApplication.class, args);
}
}
```
## Configuration file
`application.yml`
```yaml
spring:
datasource:
dynamic:
datasource:
master:
username: root
password: 123456
url: jdbc:mysql://127.0.0.1:3306/spring-boot-demo?useUnicode=true&characterEncoding=UTF-8&useSSL=false&autoReconnect=true&failOverReadOnly=false&serverTimezone=GMT%2B8
driver-class-name: com.mysql.cj.jdbc.Driver
slave:
username: root
password: root
url: jdbc:mysql://127.0.0.1:3306/spring-boot-demo-2?useUnicode=true&characterEncoding=UTF-8&useSSL=false&autoReconnect=true&failOverReadOnly=false&serverTimezone=GMT%2B8
driver-class-name: com.mysql.cj.jdbc.Driver
mp-enabled: true
logging:
level:
com.xkcoding.multi.datasource.mybatis: debug
```
## Test class
```java
/**
* <p>
 * Tests the master/slave data sources
* </p>
*
* @package: com.xkcoding.multi.datasource.mybatis.service.impl
 * @description: Tests the master/slave data sources
* @author: yangkai.shen
* @date: Created in 2019-01-21 14:45
* @copyright: Copyright (c) 2019
* @version: V1.0
* @modified: yangkai.shen
*/
@Slf4j
public class UserServiceImplTest extends SpringBootDemoMultiDatasourceMybatisApplicationTests {
@Autowired
private UserService userService;
/**
     * Insert into the master and the slave
*/
@Test
public void addUser() {
User userMaster = User.builder().name("主库添加").age(20).build();
userService.addUser(userMaster);
User userSlave = User.builder().name("从库添加").age(20).build();
userService.save(userSlave);
}
/**
     * Query from the slave
*/
@Test
public void testListUser() {
List<User> list = userService.list(new QueryWrapper<>());
log.info("【list】= {}", JSONUtil.toJsonStr(list));
}
}
```
### Test results
Both the master and the slave data sources are loaded successfully:
```java
2019-01-21 14:55:41.096 INFO 7239 --- [ main] com.zaxxer.hikari.HikariDataSource : master - Starting...
2019-01-21 14:55:41.307 INFO 7239 --- [ main] com.zaxxer.hikari.HikariDataSource : master - Start completed.
2019-01-21 14:55:41.308 INFO 7239 --- [ main] com.zaxxer.hikari.HikariDataSource : slave - Starting...
2019-01-21 14:55:41.312 INFO 7239 --- [ main] com.zaxxer.hikari.HikariDataSource : slave - Start completed.
2019-01-21 14:55:41.312 INFO 7239 --- [ main] c.b.d.d.DynamicRoutingDataSource : 初始共加载 2 个数据源
2019-01-21 14:55:41.313 INFO 7239 --- [ main] c.b.d.d.DynamicRoutingDataSource : 动态数据源-加载 slave 成功
2019-01-21 14:55:41.313 INFO 7239 --- [ main] c.b.d.d.DynamicRoutingDataSource : 动态数据源-加载 master 成功
2019-01-21 14:55:41.313 INFO 7239 --- [ main] c.b.d.d.DynamicRoutingDataSource : 当前的默认数据源是单数据源,数据源名为 master
_ _ |_ _ _|_. ___ _ | _
| | |\/|_)(_| | |_\ |_)||_|_\
/ |
3.0.7.1
```
It is **recommended** that the **master** database only execute **INSERT**, **UPDATE**, and **DELETE** operations

It is **recommended** that the **slave** database only execute **SELECT** operations

> A production environment needs **master-slave replication** to be set up
## References
1. Mybatis-Plus dynamic data source documentation: https://mybatis.plus/guide/dynamic-datasource.html
2. Official Mybatis-Plus dynamic data source integration demo: https://gitee.com/baomidou/dynamic-datasource-spring-boot-starter/tree/master/samples
| 25.425587 | 179 | 0.636168 | yue_Hant | 0.429447 |
fa3072b20cd3475aa4e5321fa18510ab713d5016 | 1,756 | md | Markdown | README.md | atlassubbed/play-relax-visualized | b9077534c26803c26333b5e85460e1081947af95 | [
"MIT"
] | 1 | 2019-04-08T19:16:01.000Z | 2019-04-08T19:16:01.000Z | README.md | atlassubbed/play-relax-visualized | b9077534c26803c26333b5e85460e1081947af95 | [
"MIT"
] | null | null | null | README.md | atlassubbed/play-relax-visualized | b9077534c26803c26333b5e85460e1081947af95 | [
"MIT"
] | null | null | null | # play-relax-visualized
Interactive visualization of how Relax's diffing engine works.
---
<img src="https://user-images.githubusercontent.com/38592371/55458548-3ea8b580-55bb-11e9-9d8f-ea05144d96cb.gif">
## live on codesandbox
https://codesandbox.io/s/github/atlassubbed/play-relax-visualized
### what's the app?
This is a basic todo app built with Relax and Munchlax. You can add/remove items, clear all items, change the sort order, etc. This functionality demos what happens to the VDOM when you update, mount, unmount and move nodes.
### what's on the canvas?
The canvas renders the todo list, but in graph form. Go ahead and play around with the list and see the graph structure of the app change in real-time. Note that the tree is in [LCRS form](https://en.wikipedia.org/wiki/Left-child_right-sibling_binary_tree). All that means is that instead of a parent node storing pointers to all of its children, it only points to its first child, which in turn points to the next sibling, etc.
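For concreteness, here's a hypothetical sketch (plain JS, not Relax's actual internal node type) of an LCRS node and a traversal over its children:
```js
// Each node keeps two pointers instead of a child array
function Node(name) {
  this.name = name;
  this.child = null; // first (leftmost) child
  this.sib = null;   // next sibling
}

// Visiting all children is a walk along the sibling chain
function forEachChild(node, visit) {
  for (let c = node.child; c; c = c.sib) visit(c);
}

// usage
const list = new Node("todo-list");
list.child = new Node("item-1");
list.child.sib = new Node("item-2");
forEachChild(list, n => console.log(n.name)); // item-1, item-2
```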
### updates
Note that nodes on the canvas will flash white for a frame when you start an update. This is on purpose, so you can see which parts of the app were traversed for the update.
### reactivity scope
This TODO app is a crude implementation of a todo list. It could be smarter. One thing you'll notice is that the reactivity can be much better scoped. Scoped reactivity refers to confining the reactive downstream to the minimum set of dependent nodes. Bringing state "up" disrespects reactive scope, which is why "sideways" data is preferred.
### ☠ mobile ☠
I haven't bothered tuning the canvas stuff for mobile. Check it out if you're on desktop Chrome. Ye be warned.
### made with 💜
[Relax](https://github.com/atlassubbed/atlas-relax)
| 47.459459 | 428 | 0.769362 | eng_Latn | 0.997162 |
fa31d74d28a40599a1b1ac9d6f17cebfe0c6b160 | 11,143 | md | Markdown | desktop-src/shell/shortcut-choose-method.md | nbassett-MSFT/win32 | 13f1bdd126606b889e607594f7e8f7cfe58eb008 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-04-24T13:02:42.000Z | 2021-07-17T15:32:03.000Z | desktop-src/shell/shortcut-choose-method.md | nbassett-MSFT/win32 | 13f1bdd126606b889e607594f7e8f7cfe58eb008 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | desktop-src/shell/shortcut-choose-method.md | nbassett-MSFT/win32 | 13f1bdd126606b889e607594f7e8f7cfe58eb008 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2022-03-09T23:50:05.000Z | 2022-03-09T23:50:05.000Z | ---
Description: 'This topic is organized as follows:'
title: Choosing a Static or Dynamic Shortcut Menu Method
ms.topic: article
ms.date: 05/31/2018
ms.assetid: 44227BCF-D35E-4a9e-B4E6-D50E6AFBAEDF
api_name:
api_type:
api_location:
topic_type:
- kbArticle
---
# Choosing a Static or Dynamic Shortcut Menu Method
This topic is organized as follows:
- [Choose a Verb Method](#choose-a-verb-method)
- [Static Verb Methods](#static-verb-methods)
- [Preferred Dynamic Verb Methods](#preferred-dynamic-verb-methods)
- [Discouraged Dynamic Verb Methods](#discouraged-dynamic-verb-methods)
- [Extend a Shortcut Menu](#extend-a-shortcut-menu)
- [Support for Verb Methods by Operating System](#support-for-verb-methods-by-operating-system)
- [Related topics](#related-topics)
## Choose a Verb Method
It is strongly encouraged that you implement a shortcut menu using one of the static verb methods.
### Static Verb Methods
Static verbs are the simplest verbs to implement, but they still provide rich functionality. Always choose the simplest shortcut menu method that meets your needs.
<table>
<colgroup>
<col style="width: 50%" />
<col style="width: 50%" />
</colgroup>
<thead>
<tr class="header">
<th>Static Verb</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><a href="https://docs.microsoft.com/windows/desktop/api/processthreadsapi/nf-processthreadsapi-createprocessa"><strong>CreateProcess</strong></a> with command line parameters</td>
<td>This is the simplest and most familiar means of implementing a static verb. A process is invoked through a call to the <a href="https://docs.microsoft.com/windows/desktop/api/processthreadsapi/nf-processthreadsapi-createprocessa"><strong>CreateProcess</strong></a> function with the selected files and any optional parameters passed as the command line. This opens the file or folder.<br/> This method has the following limitations:
<ul>
<li>The command-line length is limited to 2000 characters, which limits the number of items that the verb can handle.</li>
<li>Can only be used with file system items.</li>
<li>Does not enable re-use of an already running process.</li>
<li>Requires that an executable be installed to handle the verb.</li>
</ul>
<br/></td>
</tr>
<tr class="even">
<td><strong>DropTarget</strong>/<a href="https://docs.microsoft.com/windows/desktop/api/oleidl/nn-oleidl-idroptarget"><strong>IDropTarget</strong></a></td>
<td>A COM-based verb activation means that supports in-proc or out-of-proc activation. <strong>DropTarget</strong>/<a href="https://docs.microsoft.com/windows/desktop/api/oleidl/nn-oleidl-idroptarget"><strong>IDropTarget</strong></a> also supports re-use of an already running handler when the <strong>IDropTarget</strong> interface is implemented by a local server. It also perfectly expresses the items via the marshaled data object and provides a reference to the invoking site chain so that you can interact with the invoker through the <a href="https://docs.microsoft.com/previous-versions/windows/internet-explorer/ie-developer/platform-apis/cc678966(v=vs.85)"><strong>QueryService</strong></a>.</td>
</tr>
<tr class="odd">
<td>Windows 7 and later: <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexecutecommand"><strong>IExecuteCommand</strong></a></td>
<td>The most direct implementation method. Because this is a COM-based invoke method (like DropTarget) this interface supports in-proc and out-of-proc activation. The verb implements <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexecutecommand"><strong>IExecuteCommand</strong></a> and <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iobjectwithselection"><strong>IObjectWithSelection</strong></a>, and optionally <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iinitializecommand"><strong>IInitializeCommand</strong></a>. The items are passed directly as a Shell item array and more of the parameters from the invoker are available to the verb implementation, including the invoke point, keyboard state, and so forth.</td>
</tr>
<tr class="even">
<td>Windows 7 and later:<strong>ExplorerCommand</strong>/ <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommand"><strong>IExplorerCommand</strong></a></td>
<td>Enables data sources that provide their command module commands through <a href="/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommandprovider"><strong>IExplorerCommandProvider</strong></a> to use those commands as verbs on a shortcut menu. Because this interface supports in-process activation only, it is recommended for use by Shell data sources that need to share the implementation between commands and shortcut menus.</td>
</tr>
</tbody>
</table>
> [!Note]
> [**IExplorerCommand**](/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommand) is a hybrid between a static and dynamic verb. **IExplorerCommand** was declared in Windows Vista, but its ability to implement a verb in a shortcut menu is new to Windows 7.
For more information about [**IDropTarget**](https://msdn.microsoft.com/en-us/library/ms679679(v=VS.85).aspx) and Shell queries for file association attributes, see [Perceived Types and Application Registration](fa-perceivedtypes.md).
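As a rough illustration of the simplest static method (the **CreateProcess** row above), a verb can be registered entirely in the registry. The ProgID `txtfile`, the verb name, and the executable path below are placeholders for illustration, not values this topic prescribes:

```reg
Windows Registry Editor Version 5.00

; Hypothetical static verb on the txtfile ProgID
[HKEY_CLASSES_ROOT\txtfile\shell\opentool]
@="Open with My Tool"

; The selected file is passed on the command line as %1
[HKEY_CLASSES_ROOT\txtfile\shell\opentool\command]
@="\"C:\\Program Files\\MyTool\\MyTool.exe\" \"%1\""
```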
### Preferred Dynamic Verb Methods
The following dynamic verb methods are preferred:
| Verb Type | Description |
|-------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Static verb (listed in the previous table) + Advanced Query Syntax (AQS) | This choice gets dynamic verb visibility.<br/> |
| Windows 7 and later: [**IExplorerCommand**](/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommand) | This choice enables a common implementation of verbs and explorer commands that are displayed in the command module in Windows Explorer. |
| Windows 7 and later: [**IExplorerCommandState**](/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommandstate) + static verb | This choice also gets dynamic verb visibility. It is a hybrid model where a simple in-process handler is used to compute whether a given static verb should be displayed. This can be applied to all of the static verb implementation methods to achieve dynamic behavior and minimize the exposure of the in-process logic. [**IExplorerCommandState**](/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommandstate) has the advantage of running on a background thread, and thereby avoids UI hangs. It is considerably simpler than [**IContextMenu**](https://msdn.microsoft.com/en-us/library/Bb776095(v=VS.85).aspx). |
### Discouraged Dynamic Verb Methods
[**IContextMenu**](https://msdn.microsoft.com/en-us/library/Bb776095(v=VS.85).aspx) is the most powerful but also the most complicated method to implement. It is based on in-process COM objects that run on the thread of the caller, which is usually Windows Explorer but can be any application hosting the items. **IContextMenu** supports verb visibility, ordering, and custom drawing. Some of these features have been added to the static verb features, such as an icon to be associated with a command, and [**IExplorerCommand**](/windows/desktop/api/shobjidl_core/nn-shobjidl_core-iexplorercommand) to deal with visibility.
If you must extend the shortcut menu for a file type by registering a dynamic verb for the file type, then follow the instructions provided in [Customizing a Shortcut Menu Using Dynamic Verbs](shortcut-menu-using-dynamic-verbs.md).
## Extend a Shortcut Menu
After you choose a verb method you can extend a shortcut menu for a file type by registering a static verb for the file type. For more information, see [Creating Context Menu Handlers](context-menu-handlers.md).
## Support for Verb Methods by Operating System
Support for verb invocation methods by operating system are listed in the following table.
| Verb method          | Windows XP | Windows Vista | Windows 7 and beyond |
|----------------------|------------|---------------|----------------------|
| CreateProcess | X | X | X |
| DDE | X | X | X |
| DropTarget | X | X | X |
| ExecuteCommand | | X | X |
| ExplorerCommand | | | X |
| ExplorerCommandState | | | X |
## Related topics
<dl> <dt>
[Best Practices for Shortcut Menu Handlers and Multiple Selection Verbs](verbs-best-practices.md)
</dt> <dt>
[Creating Shortcut Menu Handlers](context-menu-handlers.md)
</dt> <dt>
[Customizing a Shortcut Menu Using Dynamic Verbs](shortcut-menu-using-dynamic-verbs.md)
</dt> <dt>
[Shortcut (Context) Menus and Shortcut Menu Handlers](context-menu.md)
</dt> <dt>
[Shortcut Menu Reference](context-menu-reference.md)
</dt> <dt>
[Verbs and File Associations](fa-verbs.md)
</dt> </dl>
| 69.21118 | 765 | 0.59221 | eng_Latn | 0.915217 |
fa32607d9f303d9f936f855bbd2192e0ce07349d | 2,551 | md | Markdown | readme.md | lee-ellam/ui | d3296daa33aa52508749219f42d3f10261afa0a4 | [
"MIT"
] | null | null | null | readme.md | lee-ellam/ui | d3296daa33aa52508749219f42d3f10261afa0a4 | [
"MIT"
] | null | null | null | readme.md | lee-ellam/ui | d3296daa33aa52508749219f42d3f10261afa0a4 | [
"MIT"
] | null | null | null | # GG.UI
A portable UI component library, available to the whole JG ecosystem. Consists of CSS, JS and HTML (output) components.
In order to make the library as modular as possible, we need to provide modules as distinct packages, using Bower and NuGet. Modules should be released using git tags and semver'd. Gulp will accomplish this automatically.
To maintain backwards compatibility with the old UI submodule, I suggest splitting the existing styleguide from all of the pure modules and other pieces that we want to remove, then moving the styleguide into the new architecture, allowing us to build on it over time. This will allow GG.Web.Chrome to point to the original version of the styleguide in the new architecture.
## Proposed architecture
Everything should be separate, self contained modules. The styleguide should also be a module, to be used by the likes of GG.Web.Chrome, so that it may be cached across all of the microsites.
Modules should be separated by feature. All modules should be compiled to vanilla JavaScript and/or CSS before being released (they may also distribute source files, e.g. TypeScript and less files, but must export compiled versions) - this can be accomplished with a gulp task prior to tagging and releasing. This ensures that all packages can be used in any project regardless of architecture (we should make no assumptions as to what the end user wants to do and should make the barrier to entry as low as possible).
Any module that is library-specific should include that library in its name; it should also inherit from a common module - e.g. an Angular module should be attached to 'jgUtils' - or be a self-contained 'app', a la 'ui.bootstrap'. Modules may include any combination of CSS, JavaScript and HTML.
If the module contains JavaScript, unit tests must be included within the module folder. This is to ensure that tests pass before the module can be versioned and published.
To include serverside partials, we can create partial views with a dynamic model, i.e. `@model dynamic`, then reference the `@Model.propertyName` as usual, just pass an `ExpandoObject` to the partial.
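As a sketch (the `_Alert` partial name and its properties are invented for this example), the calling view might look like:

```csharp
@{
    // Attach whatever properties the partial expects to a dynamic model.
    dynamic alert = new System.Dynamic.ExpandoObject();
    alert.Title = "Heads up";
    alert.Message = "Your session is about to expire.";
}
@Html.Partial("_Alert", (object)alert)
```

The partial itself declares `@model dynamic` and reads `@Model.Title` and `@Model.Message` as usual.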
A basic example is below:
```
./
- modules/
- global/
- bower.json
- package.json
- gulpfile.js
- src/
- css/
- styleguide.less
- base/
- _typography.less
...
- js/
- global.js
- views/
- _Partial.cshtml
- dist/
...
- futuredatevalidator-angular/
- src/
- dist/
...
...
```
| 54.276596 | 518 | 0.72599 | eng_Latn | 0.999256 |
fa329edc3f9c7e29c1337cbd21e3a62e70ada7ca | 1,148 | md | Markdown | windows-driver-docs-pr/debugger/windbg-breakpoints-preview.md | thethales/windows-driver-docs | 55455d5e0ef9b8087e36c3bac7301b0db8ce79ba | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-03-09T16:43:21.000Z | 2021-03-09T16:43:21.000Z | windows-driver-docs-pr/debugger/windbg-breakpoints-preview.md | thethales/windows-driver-docs | 55455d5e0ef9b8087e36c3bac7301b0db8ce79ba | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/debugger/windbg-breakpoints-preview.md | thethales/windows-driver-docs | 55455d5e0ef9b8087e36c3bac7301b0db8ce79ba | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-02-23T22:45:54.000Z | 2021-02-23T22:45:54.000Z | ---
title: WinDbg Preview - Breakpoints Menu
description: This section describes how to set and clear breakpoints using the WinDbg preview debugger.
ms.date: 08/15/2017
ms.localizationpriority: medium
---
# WinDbg Preview - Breakpoints Menu
This section describes how to work with breakpoints using the WinDbg preview debugger.
## Breakpoints Menu
Use the Breakpoints menu to create new breakpoints, remove existing ones, and toggle the initial breakpoint (the initial breakpoint is currently kernel-mode only).

## Breakpoints Window
Use the breakpoints window, opened via the View menu, to view, set and clear breakpoints. You can also double-click a breakpoint to open its source file.

The breakpoints window keeps a running total of how many times each breakpoint is hit.
The general process of working with breakpoints is similar to previous versions of WinDbg. For more information about setting breakpoints, see [Setting Breakpoints in WinDbg](setting-breakpoints-in-windbg.md).
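As in earlier versions, you can also manage breakpoints from the command window; for example (the module and function below are only illustrative):

```dbgcmd
bp kernel32!CreateFileW
bl
bc 0
```

Here `bp` sets a breakpoint, `bl` lists the current breakpoints, and `bc` clears a breakpoint by its number.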
| 42.518519 | 209 | 0.803136 | eng_Latn | 0.986882 |
fa3447d742c34e85040127ba55400c189015f264 | 45 | md | Markdown | README.md | paulosmallz/tracking_contributions | 25dd5a4cfe1ce756ea29a006cd8081a1cde7268c | [
"Apache-2.0"
] | null | null | null | README.md | paulosmallz/tracking_contributions | 25dd5a4cfe1ce756ea29a006cd8081a1cde7268c | [
"Apache-2.0"
] | null | null | null | README.md | paulosmallz/tracking_contributions | 25dd5a4cfe1ce756ea29a006cd8081a1cde7268c | [
"Apache-2.0"
] | null | null | null | # tracking_contributions
track contributions
| 15 | 24 | 0.888889 | eng_Latn | 0.824267 |
fa34b4c3d4ab07714ce1f5dc9add43bae5db0285 | 1,647 | md | Markdown | _posts/2009-12-27-昨晚我Wrong-Hole.md | backup53/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 18 | 2020-01-02T21:43:02.000Z | 2022-02-14T02:40:34.000Z | _posts/2009-12-27-昨晚我Wrong-Hole.md | wzxwj/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 3 | 2020-01-01T16:53:59.000Z | 2020-01-05T10:14:11.000Z | _posts/2009-12-27-昨晚我Wrong-Hole.md | backup53/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 13 | 2020-01-20T14:27:39.000Z | 2021-08-16T02:13:21.000Z | ---
layout: default
date: 2009-12-27
title: Last Night I Wrong-Holed
categories: 罗马假日公寓
---
# Last Night I Wrong-Holed
青春的脚步
Life is like a tooth mug: you can see it as a washbasin ("comedy") or as a cup ("tragedy")!~
Floor 1, posted 2009-12-27 17:57
Last night I Wrong-Holed
http://www.56.com/u31/v_NDc5NTU1MzI.html
See for yourselves!~
Everyone, gather round~
[ Last edited by 青春的脚步 on 2009-12-27 18:08 ]
---
Compiled by [Terminusbot](https://github.com/TerminusBot); for discussion, please visit [2049bbs.xyz](http://2049bbs.xyz/)
---
青春的脚步
Life is like a tooth mug: you can see it as a washbasin ("comedy") or as a cup ("tragedy")!~
Floor 2, posted 2009-12-27 17:59
Damn!~ The video won't play?
fscrazymouse
Resident Mahler Gobi observer of the Celestial Empire
Floor 3, posted 2009-12-27 18:01
This title... looks familiar
Lian
Floor 4, posted 2009-12-27 18:28
Pretty hilarious
青春的脚步
Life is like a tooth mug: you can see it as a washbasin ("comedy") or as a cup ("tragedy")!~
Floor 5, posted 2009-12-27 18:29
The Chinese title could be "Ode to the Chrysanthemum"
Lian
Floor 6, posted 2009-12-27 18:31
A good man is like a one-yuan coin: a "1" on the front, a chrysanthemum on the back...
青春的脚步
Life is like a tooth mug: you can see it as a washbasin ("comedy") or as a cup ("tragedy")!~
Floor 7, posted 2009-12-27 18:34
A good woman is a wordless book: readers at different levels take different meanings from it.
牛Sco(王二)
The winter-day puzzlement of Suzumiya
Floor 8, posted 2009-12-29 09:43
Good thing it didn't hit the pelvis
Leonhardt
Ālaya-consciousness Seminar, Australia Branch, current member; one would-be life-loser dog and part-time RFI porter (unpaid, irregular hours)
Floor 9, posted 2009-12-31 00:54
Last night you were living on Mars... (way late to this)
lx5885
Floor 10, posted 2009-12-31 16:58
Just here to watch
青春的脚步
Life is like a tooth mug: you can see it as a washbasin ("comedy") or as a cup ("tragedy")!~
Floor 11, posted 2010-1-1 15:36
Last night it went all "seumnida"~
冇二
@bocaidingding welcome to follow
Floor 12, posted 2010-1-1 21:18
Hahaha, dying of laughter
| 4.201531 | 89 | 0.537341 | yue_Hant | 0.998874 |
fa3581c785f53b08a80677a19ea6d417132cb36a | 1,173 | md | Markdown | host/README.md | openbsod/influx_dashboards | 454b9f45aab3507715aecc42587ec53961969024 | [
"Apache-2.0"
] | 1 | 2021-02-04T13:22:39.000Z | 2021-02-04T13:22:39.000Z | host/README.md | openbsod/influx_dashboards | 454b9f45aab3507715aecc42587ec53961969024 | [
"Apache-2.0"
] | null | null | null | host/README.md | openbsod/influx_dashboards | 454b9f45aab3507715aecc42587ec53961969024 | [
"Apache-2.0"
] | null | null | null | # host
These dashboards are designed for use with data from the following Telegraf input plugins:
* [cpu](https://docs.influxdata.com/telegraf/latest/plugins/inputs/#cpu)
* [mem](https://docs.influxdata.com/telegraf/latest/plugins/inputs/#mem)
* [processes](https://docs.influxdata.com/telegraf/latest/plugins/inputs/#processes)
* [swap](https://docs.influxdata.com/telegraf/latest/plugins/inputs/#swap)
* [system](https://docs.influxdata.com/telegraf/latest/plugins/inputs/#system)
## Host: Compute Performance

## Telegraf Input Configuration
The following input plugin configuration is required to provide the data for these dashboards.
```
[[inputs.cpu]]
# Whether to report per-cpu stats or not
percpu = true
# Whether to report total system cpu stats or not
totalcpu = true
# If true, collect raw CPU time metrics.
collect_cpu_time = false
# If true, compute and report the sum of all non-idle CPU states.
report_active = false
[[inputs.mem]]
[[inputs.processes]]
[[inputs.swap]]
[[inputs.system]]
```
| 30.868421 | 130 | 0.752771 | eng_Latn | 0.538273 |
fa35c15695641f3e9f3beb035b46046c6da8ac20 | 469 | md | Markdown | 1. The Internet and IP/1.7. Encapsulation Principle.md | haoliangyu/introduction-to-computer-networking | 96dfbe1bfbae3439c706f4aad92ee9ae50f5c502 | [
"MIT"
] | 2 | 2019-02-06T15:19:35.000Z | 2021-05-08T01:19:39.000Z | 1. The Internet and IP/1.7. Encapsulation Principle.md | haoliangyu/introduction-to-computer-networking | 96dfbe1bfbae3439c706f4aad92ee9ae50f5c502 | [
"MIT"
] | null | null | null | 1. The Internet and IP/1.7. Encapsulation Principle.md | haoliangyu/introduction-to-computer-networking | 96dfbe1bfbae3439c706f4aad92ee9ae50f5c502 | [
"MIT"
] | null | null | null | # Encapsulation Principle
Encapsulation is the principle that lets us take protocol layers and let them easily share the storage within a package.
The data payload of a layer is the whole data + header from the previous layer. When a layer receive data from the last layer, it Encapsulates the data into a payload and send it to the next layer with its header. So layer N data is payload to layer N - 1.
Encapsulation allows you to layer recursively. See TLS model.
| 58.625 | 256 | 0.791045 | eng_Latn | 0.999701 |
fa35c3fcab3001a82d8a8119a644dc935bfae432 | 628 | md | Markdown | README.md | 2teeFruit/image-nudity | 1c9f41aa2a906de2a074eeeaa5994b9a19f6030f | [
"MIT"
] | 7 | 2021-05-26T15:55:43.000Z | 2022-03-31T07:31:44.000Z | README.md | 2teeFruit/image-nudity | 1c9f41aa2a906de2a074eeeaa5994b9a19f6030f | [
"MIT"
] | null | null | null | README.md | 2teeFruit/image-nudity | 1c9f41aa2a906de2a074eeeaa5994b9a19f6030f | [
"MIT"
] | 2 | 2021-09-27T21:02:45.000Z | 2022-03-07T11:17:51.000Z | # image-nudity
Service in Node JS to classify images.
## Usage
Build image, run container, show logs
```
docker-compose build && docker-compose up -d && docker-compose logs -f
```
## Endpoints
Classify image with extension JPG, PNG, GIF, BMP
```
http://[HOST]:[PORT]/image/?url=[IMAGE_URL]
```
Classify GIF Animated frame by frame
```
http://[HOST]:[PORT]/imagegif/?url=[IMAGE_URL]
```
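For example, assuming the service is mapped to port 3000 on localhost (check `docker-compose.yml` for the actual port mapping; the image URL is a placeholder):

```
curl "http://localhost:3000/image/?url=https://example.com/picture.jpg"
```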
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
## License
[MIT](https://choosealicense.com/licenses/mit/) | 26.166667 | 114 | 0.724522 | eng_Latn | 0.853546 |
fa377b376a881ddcb1e63220b47209366668b360 | 2,974 | md | Markdown | _posts/2018-02-27-lung_microbiota_paper8.md | zd200572/zd200572.github.io | 4fcf29594f4890e299627eae7124351178192b6c | [
"Apache-2.0"
] | 3 | 2017-06-28T03:13:37.000Z | 2021-02-04T01:17:50.000Z | _posts/2018-02-27-lung_microbiota_paper8.md | zd200572/zd200572.github.io | 4fcf29594f4890e299627eae7124351178192b6c | [
"Apache-2.0"
] | null | null | null | _posts/2018-02-27-lung_microbiota_paper8.md | zd200572/zd200572.github.io | 4fcf29594f4890e299627eae7124351178192b6c | [
"Apache-2.0"
] | 1 | 2017-12-03T03:37:58.000Z | 2017-12-03T03:37:58.000Z | ---
layout: post
title: The Lung Microbiome--New Principles for Respiratory Bacteriology in Health and Disease
subtitle: Microbiology study notes
date: 2018-02-27
author: zd200572
header-img: img/post-bg-alitrip.jpg
catalog: true
tags:
- microbiology
---
> Original article: Dickson RP, Huffnagle GB (2015) The Lung Microbiome: New Principles for Respiratory Bacteriology in Health and Disease. PLoS Pathog 11 (7): e1004923. doi:10.1371/journal.ppat.1004923
# Introduction
The principles of respiratory microbiology are being re-evaluated and rewritten, starting with the debunking of the myth that the lungs are sterile. The "terrain" of the respiratory ecosystem is anatomically and physiologically distinct from other mucosal sites, and when the dynamic homeostasis between host and microbiota is disrupted, the "terrain" of disease changes dramatically. Researchers are only beginning to understand the contributions of viruses, bacteriophages, and fungi to the lung microbiome, so this discussion is limited to the bacterial microbiome of the lung.
# The lungs are not sterile
The notion that the lungs are sterile is still frequently repeated in textbooks, almost always without citation. If true, this claim would be extraordinary. **Bacteria are remarkably diverse and adaptable**; there is scarcely an extreme environmental niche on Earth (in terms of oxygen, pH, hydrophobicity, temperature, salinity, predators, nutrient scarcity, and so on) where bacterial communities cannot be found [1]. It would be truly remarkable if one of the planet's rare bacteria-free environments were a warm, moist mucosal surface a few inches below the mouth, bathed in a constant flow of bacteria-laden air, microaerosols, and fluids. Many studies, dating back nearly a century, have shown that microaspiration (of bacteria) is common in healthy, asymptomatic subjects [2-5], and knowledge of the bacterial content of inhaled air is as old as germ theory itself [6]. Since the first culture-independent report of a healthy lung microbiome [7], more than 30 studies using molecular techniques for bacterial identification have found evidence of bacteria in the lower airways. No modern study has found evidence of their absence.
# Mucosal biology: the lung is not the gut
Although the gut and lung are both mucosa-lined luminal organs with a shared embryological origin, their gross and microscopic anatomical features are very different, and there are **clear differences in the dynamic composition and populations of their microbiota**. In the absence of vomiting or esophageal reflux, microbial migration through the digestive tract is unidirectional (mouth to anus) and is punctuated by widely varying physical and chemical barriers. For an oral microbe to reach the cecum, it must endure the acidic pH of the stomach (~2.0) and the alkaline pH of the duodenum (~8.0) and compete for resources with a densely populated resident microbiota. **By contrast, the movement of air, mucus, and bacteria in the lung is bidirectional, with no physical barrier between the larynx and the most distal alveoli.** The lung microbiome is therefore **more dynamic and transient than that of the lower gastrointestinal tract**. The GI tract maintains a uniform temperature (37°C) along its entire 9-meter course, whereas the respiratory mucosa (barely half a meter long) spans a **gradient from ambient temperature at the point of inhalation to core body temperature** in the alveoli [8]. Unlike the gut, the lung environment is **oxygen-rich**. Although the trachea and bronchi, like the gut, are lined with mucus-secreting glycosylated proteins, most of the lung's surface area is lined with lipid-rich **surfactant**, which is **bacteriostatic** against certain bacterial species [9]. Bacterial density in the airways is small, comparable to the duodenum [7], and **several orders of magnitude lower than in the large intestine**, so inter-bacterial metabolic interactions differ markedly. Finally, the gut and lung differ in the character of their **host-bacteria interactions**: IgA levels are much higher in the gut, whereas the lung has far more extraluminal interactions between bacteria and host leukocytes (alveolar macrophages). **Together, these markedly different environmental conditions produce correspondingly different microbial communities.**
# The lung microbiome is determined by three ecological factors
The composition of the lung microbiome is determined by the balance of three factors (Figure 1) [10]: **(1) microbial immigration into the airways, (2) elimination of microbes from the airways, and (3) the relative reproduction rates of community members as determined by regional growth conditions**. Any change in the microbiome, within an individual or in a disease state, must arise from perturbation of these factors. Sources of microbial immigration include inhaled air (which contains on the order of 10^4-10^6 bacteria per cubic meter before it even reaches the bacteria-dense upper airways), subclinical microaspiration of upper-airway contents [2-5], and direct dispersion along the airway mucosa. Microbial elimination is driven by **mucociliary clearance, cough** (which occurs frequently even in healthy subjects), and **host immune defenses** (both innate and adaptive). The environmental conditions that determine regional growth in the lung include those common to all environmental niches (e.g., nutrient availability, temperature, pH, oxygen tension) as well as the abundance and activation state of host inflammatory cells. In health, these conditions are generally unfavorable to bacterial growth, resulting in relatively little bacterial reproduction; thus **the main determinants of the lung microbiome in health are the balance of immigration and elimination** [13-15]. **During disease, however, local growth conditions in the lung change dramatically, creating conditions that favor the selective reproduction of bacteria.** In advanced lung disease, the chronic phenomenon of bacterial colonization reflects the enriched growth of species that are **well adapted to the specific environmental conditions of the injured airways**. The selective reproductive advantage the lung environment confers on these community members overwhelms the dynamic influence of immigration and elimination on the respiratory ecosystem.
# Oral microbes are the main source of the lung's bacterial community in health
The prevalence of **subclinical microaspiration of pharyngeal secretions** in healthy populations is a long-established and validated observation [2-5]. Numerous culture-independent studies have since confirmed that the lung microbiome more closely **resembles that of the oropharynx** than any competing source community: inhaled air, the nasopharynx, or the lower GI tract via hematogenous spread [15-18]. Both a direct study of individuals and a large population-based model have shown that **the nasal microbiome contributes little to the healthy lung microbiome** [15,16]; the nasal microbiome more closely resembles that of the skin than that of the lung. Importantly, this similarity between lung and oral microbiota is apparent even when sampling is performed via a bronchoscope passed through the nose (indicating that upper-airway contamination of bronchoscopically acquired specimens is minimal) [10,19]. The oropharynx produces two liters of saliva per day, far more than the secretions of the healthy nasal mucosa. It is possible (but unproven) that lung and nasal microbial communities converge at times of increased rhinorrhea (e.g., acute viral infection or allergic rhinitis, both of which can trigger exacerbations of lung disease involving nasal-associated organisms such as Staphylococcus aureus and Moraxella catarrhalis).
# Changes in the lung microbiome during disease
The ecological determinants of the lung microbiome (**immigration, elimination, and regional growth conditions**) are all dramatically altered during acute and chronic lung disease [10]. **Community membership of the lung microbiome is therefore altered in disease states.** In dozens of studies comparing the microbiota of diseased lungs with that of healthy subjects, virtually all have found significant differences in community composition [20]. Many describe increased community richness (the number of species) in chronic airways, often with a shift in composition away from the ***Bacteroidetes*** phylum that dominates the healthy lung microbiome toward the ***Proteobacteria*** phylum, which contains many common lung-associated gram-negative rods. Baseline differences in lung microbiota correlate with important clinical features of chronic lung disease: the frequency of subsequent exacerbations in bronchiectasis [21], mortality in idiopathic pulmonary fibrosis [22], and responsiveness to corticosteroids and antibiotics in asthma [23,24]. Active research questions in the field include: (1) whether the lung microbiome participates in disease pathogenesis or is merely a marker of injury and inflammation; (2) whether the lung microbiome can be therapeutically manipulated to alter exacerbation frequency or disease progression; and (3) **how the diversity and homeostasis of the lung ecosystem collapse** and become dominated by a single pathogen, as in pneumonia [14,25].
fa379dd3c85aafda7366776cd106c44d759a1311 | 8,442 | md | Markdown | docs/commands/container.md | ThirtyThirds/scaleway-cli | e2b192355d5a83e235282d187ce4a08f120202cc | [
"Apache-2.0"
] | null | null | null | docs/commands/container.md | ThirtyThirds/scaleway-cli | e2b192355d5a83e235282d187ce4a08f120202cc | [
"Apache-2.0"
] | 1 | 2022-03-09T13:16:16.000Z | 2022-03-09T13:16:16.000Z | docs/commands/container.md | ThirtyThirds/scaleway-cli | e2b192355d5a83e235282d187ce4a08f120202cc | [
"Apache-2.0"
] | null | null | null | <!-- DO NOT EDIT: this file is automatically generated using scw-doc-gen -->
# Documentation for `scw container`
Container as a Service API
- [Container management commands](#container-management-commands)
- [Create a new container](#create-a-new-container)
- [Delete a container](#delete-a-container)
- [Deploy a container](#deploy-a-container)
- [Get a container](#get-a-container)
- [List all your containers](#list-all-your-containers)
- [Update an existing container](#update-an-existing-container)
- [Cron management commands](#cron-management-commands)
- [Delete an existing cron](#delete-an-existing-cron)
- [Get a cron](#get-a-cron)
- [List all your crons](#list-all-your-crons)
- [Namespace management commands](#namespace-management-commands)
- [Create a new namespace](#create-a-new-namespace)
- [Delete an existing namespace](#delete-an-existing-namespace)
- [Get a namespace](#get-a-namespace)
- [List all your namespaces](#list-all-your-namespaces)
- [Update an existing namespace](#update-an-existing-namespace)
## Container management commands
Container management commands.
### Create a new container
Create a new container.
**Usage:**
```
scw container container create [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| namespace-id | | |
| name | Default: `<generated>` | |
| environment-variables.value.{key} | | |
| min-scale | | |
| max-scale | | |
| memory-limit | | |
| timeout.seconds | | |
| timeout.nanos | | |
| privacy | One of: `unknown_privacy`, `public`, `private` | |
| description | | |
| registry-image | | |
| max-concurrency | | |
| domain-name | | |
| protocol | One of: `unknown_protocol`, `http1`, `h2c` | |
| port | | |
| secret-environment-variables.{index}.key | | |
| secret-environment-variables.{index}.value | | |
| http-option | | Configure how HTTP and HTTPS requests are handled |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
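For example, a minimal create call might look like this (all values below are placeholders, not real IDs or images):

```
scw container container create namespace-id=11111111-1111-1111-1111-111111111111 name=my-container registry-image=rg.fr-par.scw.cloud/my-namespace/my-image:latest
```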
### Delete a container
Delete the container associated with the given id.
**Usage:**
```
scw container container delete <container-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| container-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Deploy a container
Deploy a container associated with the given id.
**Usage:**
```
scw container container deploy <container-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| container-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Get a container
Get the container associated with the given id.
**Usage:**
```
scw container container get <container-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| container-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### List all your containers
List all your containers.
**Usage:**
```
scw container container list [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| order-by | One of: `created_at_asc`, `created_at_desc`, `name_asc`, `name_desc` | |
| namespace-id | | |
| name | | |
| project-id | | |
| organization-id | | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Update an existing container
Update the container associated with the given id.
**Usage:**
```
scw container container update <container-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| container-id | Required | |
| environment-variables.value.{key} | | |
| min-scale | | |
| max-scale | | |
| memory-limit | | |
| timeout.seconds | | |
| timeout.nanos | | |
| redeploy | | |
| privacy | One of: `unknown_privacy`, `public`, `private` | |
| description | | |
| registry-image | | |
| max-concurrency | | |
| domain-name | | |
| protocol | One of: `unknown_protocol`, `http1`, `h2c` | |
| port | | |
| secret-environment-variables.{index}.key | | |
| secret-environment-variables.{index}.value | | |
| http-option | | Configure how HTTP and HTTPS requests are handled |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
## Cron management commands
Cron management commands.
### Delete an existing cron
Delete the cron associated with the given id.
**Usage:**
```
scw container cron delete <cron-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| cron-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Get a cron
Get the cron associated with the given id.
**Usage:**
```
scw container cron get <cron-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| cron-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### List all your crons
List all your crons.
**Usage:**
```
scw container cron list [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| order-by | One of: `created_at_asc`, `created_at_desc` | |
| container-id | | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
## Namespace management commands
Namespace management commands.
### Create a new namespace
Create a new namespace.
**Usage:**
```
scw container namespace create [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| name | Default: `<generated>` | |
| environment-variables.value.{key} | | |
| project-id | | Project ID to use. If none is passed the default project ID will be used |
| description | | |
| secret-environment-variables.{index}.key | | |
| secret-environment-variables.{index}.value | | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Delete an existing namespace
Delete the namespace associated with the given id.
**Usage:**
```
scw container namespace delete <namespace-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| namespace-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Get a namespace
Get the namespace associated with the given id.
**Usage:**
```
scw container namespace get <namespace-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| namespace-id | Required | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### List all your namespaces
List all your namespaces.
**Usage:**
```
scw container namespace list [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| order-by | One of: `created_at_asc`, `created_at_desc`, `name_asc`, `name_desc` | |
| name | | |
| project-id | | |
| organization-id | | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
### Update an existing namespace
Update the namespace associated with the given id.
**Usage:**
```
scw container namespace update <namespace-id ...> [arg=value ...]
```
**Args:**
| Name | | Description |
|------|---|-------------|
| namespace-id | Required | |
| environment-variables.value.{key} | | |
| description | | |
| secret-environment-variables.{index}.key | | |
| secret-environment-variables.{index}.value | | |
| region | Default: `fr-par`<br />One of: `fr-par` | Region to target. If none is passed will use default region from the config |
| 22.816216 | 130 | 0.614665 | eng_Latn | 0.926008 |
fa38820a49cb3d89f592849b3ff601fa83d9c85e | 796 | md | Markdown | servers.md | tym-xqo/dblkp | 16021200290f22062bdeb415640d0b97b6a6dfcd | [
"WTFPL"
] | null | null | null | servers.md | tym-xqo/dblkp | 16021200290f22062bdeb415640d0b97b6a6dfcd | [
"WTFPL"
] | null | null | null | servers.md | tym-xqo/dblkp | 16021200290f22062bdeb415640d0b97b6a6dfcd | [
"WTFPL"
] | null | null | null | # BenchPrep database servers by location
## SJC
| host | public | private |
|-----------|---------------|----------------|
| bastion01 | 169.45.87.197 | 10.161.109.219 |
| nfs | 169.45.86.44 | 10.161.109.194 |
| db01 | 169.45.86.45 | 10.161.109.201 |
| db02 | 169.45.86.46 | 10.161.109.215 |
| boost01 | 169.45.86.42 | 10.161.109.248 |
## DAL
| host | public | private |
|---------------|----------------|---------------|
| bastion02 | 169.63.216.219 | 10.176.247.54 |
| db03 | 169.46.66.86 | 10.176.247.2 |
| db04 | 169.46.66.83 | 10.176.247.15 |
| db05 | 169.46.45.21 | 10.176.230.70 |
| db-lab-dal-01 | 169.46.45.29 | 10.176.230.78 |
| boost02 | 169.46.66.93 | 10.176.247.38 |
| 34.608696 | 50 | 0.44598 | yue_Hant | 0.059655 |
fa3925913164723a6959c51f68a34958c9f91597 | 4,788 | md | Markdown | ch7/ch7-03.md | charliexp/gopl-zh | 688c50f38fcb797e95810acd6f5e0d810eea1eac | [
"BSD-3-Clause"
] | 67 | 2016-02-10T17:06:05.000Z | 2021-12-29T23:35:58.000Z | ch7/ch7-03.md | shesuyo/gopl-zh | 688c50f38fcb797e95810acd6f5e0d810eea1eac | [
"BSD-3-Clause"
] | null | null | null | ch7/ch7-03.md | shesuyo/gopl-zh | 688c50f38fcb797e95810acd6f5e0d810eea1eac | [
"BSD-3-Clause"
] | 54 | 2016-02-04T01:30:42.000Z | 2021-05-15T01:48:49.000Z | ## 7.3. 實現接口的條件
一個類型如果擁有一個接口需要的所有方法,那麽這個類型就實現了這個接口。例如,\*os.File類型實現了io.Reader,Writer,Closer,和ReadWriter接口。\*bytes.Buffer實現了Reader,Writer,和ReadWriter這些接口,但是它沒有實現Closer接口因爲它不具有Close方法。Go的程序員經常會簡要的把一個具體的類型描述成一個特定的接口類型。舉個例子,\*bytes.Buffer是io.Writer;\*os.Files是io.ReadWriter。
接口指定的規則非常簡單:表達一個類型屬於某個接口隻要這個類型實現這個接口。所以:
```go
var w io.Writer
w = os.Stdout // OK: *os.File has Write method
w = new(bytes.Buffer) // OK: *bytes.Buffer has Write method
w = time.Second // compile error: time.Duration lacks Write method
var rwc io.ReadWriteCloser
rwc = os.Stdout // OK: *os.File has Read, Write, Close methods
rwc = new(bytes.Buffer) // compile error: *bytes.Buffer lacks Close method
```
這個規則甚至適用於等式右邊本身也是一個接口類型
```go
w = rwc // OK: io.ReadWriteCloser has Write method
rwc = w // compile error: io.Writer lacks Close method
```
因爲ReadWriter和ReadWriteCloser包含所有Writer的方法,所以任何實現了ReadWriter和ReadWriteCloser的類型必定也實現了Writer接口
在進一步學習前,必須先解釋表示一個類型持有一個方法當中的細節。迴想在6.2章中,對於每一個命名過的具體類型T;它一些方法的接收者是類型T本身然而另一些則是一個*T的指針。還記得在T類型的參數上調用一個*T的方法是合法的,隻要這個參數是一個變量;編譯器隱式的獲取了它的地址。但這僅僅是一個語法糖:T類型的值不擁有所有*T指針的方法,那這樣它就可能隻實現更少的接口。
舉個例子可能會更清晰一點。在第6.5章中,IntSet類型的String方法的接收者是一個指針類型,所以我們不能在一個不能尋址的IntSet值上調用這個方法:
```go
type IntSet struct { /* ... */ }
func (*IntSet) String() string
var _ = IntSet{}.String() // compile error: String requires *IntSet receiver
```
But we can call it on an IntSet variable:
```go
var s IntSet
var _ = s.String() // OK: s is a variable and &s has a String method
```
However, since only *IntSet has a String method, only *IntSet satisfies the fmt.Stringer interface:
```go
var _ fmt.Stringer = &s // OK
var _ fmt.Stringer = s // compile error: IntSet lacks String method
```
Section 12.8 includes a program that prints the methods of an arbitrary value, and the godoc -analysis=type tool (§10.7.4) displays the methods of each type and the relationships between concrete types and interfaces.
Like an envelope that wraps and conceals the letter it holds, an interface wraps and conceals the concrete type and value that it holds. Only the methods revealed by the interface type may be called, even if the concrete type has others:
```go
os.Stdout.Write([]byte("hello")) // OK: *os.File has Write method
os.Stdout.Close() // OK: *os.File has Close method
var w io.Writer
w = os.Stdout
w.Write([]byte("hello")) // OK: io.Writer has Write method
w.Close() // compile error: io.Writer lacks Close method
```
An interface type with more methods, such as io.ReadWriter, tells us more about the values it contains and places greater demands on the types that implement it than an interface with fewer methods, such as io.Reader. So what does the type interface{}, which has no methods at all, tell us about the concrete types that satisfy it?
That's right: nothing. This may seem useless, but the type interface{}, which is called the empty interface type, is in fact indispensable. Because the empty interface type places no demands on the types that satisfy it, we can assign any value at all to it.
```go
var any interface{}
any = true
any = 12.34
any = "hello"
any = map[string]int{"one": 1}
any = new(bytes.Buffer)
```
Although it isn't obvious, we have been using the empty interface type since the earliest examples in this book: it is what allows functions like fmt.Println, or errorf from Section 5.7, to accept arguments of any type.
Of course, having created an interface{} value holding a boolean, float, string, map, pointer, or any other type, we can do nothing directly with the value it holds, since the interface has no methods. We need a way to get the value back out again; we'll see how to do that using a type assertion in Section 7.10.
Since interface satisfaction depends only on the methods of the two types involved, there is no need to declare the relationship between a concrete type and the interfaces it satisfies. That said, it is occasionally useful to document and assert such a relationship when it is intended but not otherwise enforced by the program. The declaration below asserts at compile time that a value of type *bytes.Buffer satisfies io.Writer:
```go
// *bytes.Buffer must satisfy io.Writer
var w io.Writer = new(bytes.Buffer)
```
We needn't allocate a new variable, since any value of type *bytes.Buffer satisfies the interface, even nil, which we write as the explicit conversion (*bytes.Buffer)(nil). And since we never intend to refer to the variable w, we can replace it with the blank identifier. Together these changes give us this more frugal variant:
```go
// *bytes.Buffer must satisfy io.Writer
var _ io.Writer = (*bytes.Buffer)(nil)
```
Non-empty interface types such as io.Writer are most often satisfied by pointer types, particularly when one or more of the interface methods implies some kind of mutation to the receiver, as the Write method does. A pointer to a struct is an especially common method-bearing type.
But pointer types are by no means the only types that satisfy interfaces, and even interfaces with mutator methods may be satisfied by some of Go's other reference types. We have seen examples of slice types with methods (geometry.Path, §6.1) and map types with methods (url.Values, §6.2.1), and later we'll see a function type with methods (http.HandlerFunc, §7.7). Even basic types may satisfy interfaces; as we saw in Section 7.4, time.Duration satisfies fmt.Stringer.
A concrete type may satisfy many unrelated interfaces. Consider a program that organizes or sells digitized cultural artifacts like music, films, and books. It might define the following set of concrete types:
```
Album
Book
Movie
Magazine
Podcast
TVEpisode
Track
```
We can express each abstraction of interest as an interface. Some properties are common to all artifacts, such as a title, a creation date, and a list of creators.
```go
type Artifact interface {
Title() string
Creators() []string
Created() time.Time
}
```
Other properties are restricted to certain types of artifacts. Properties of the printed word apply only to books and magazines, whereas only movies and TV episodes have a screen resolution.
```go
type Text interface {
Pages() int
Words() int
PageSize() int
}
type Audio interface {
Stream() (io.ReadCloser, error)
RunningTime() time.Duration
Format() string // e.g., "MP3", "WAV"
}
type Video interface {
Stream() (io.ReadCloser, error)
RunningTime() time.Duration
Format() string // e.g., "MP4", "WMV"
Resolution() (x, y int)
}
```
These interfaces are but one useful way to group related concrete types together and express the facets they share in common. We may discover other groupings later. For example, if we find we need to handle Audio and Video items in the same way, we can define a Streamer interface to represent their common aspects without changing any existing type declarations.
```go
type Streamer interface {
Stream() (io.ReadCloser, error)
RunningTime() time.Duration
Format() string
}
```
Each grouping of concrete types based on shared behaviors can be expressed as an interface type. Unlike class-based languages, in which the set of interfaces satisfied by a class is explicit, in Go we can define new abstractions or groupings of interest whenever we need them, without modifying the declarations of the concrete types. This is particularly useful when the concrete types come from different authors. Of course, the commonalities do need to exist in the concrete types.
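As a minimal sketch (not from the original text): a `Podcast` type that happens to have the three streaming methods satisfies both `Audio` and `Streamer` implicitly, with no declaration tying it to either interface.

```go
type Podcast struct{ /* ... */ }

func (p *Podcast) Stream() (io.ReadCloser, error) { /* ... */ return nil, nil }
func (p *Podcast) RunningTime() time.Duration     { /* ... */ return 0 }
func (p *Podcast) Format() string                 { return "MP3" }

// Compile-time assertions, in the style shown earlier in this section.
var _ Audio = (*Podcast)(nil)
var _ Streamer = (*Podcast)(nil)
```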
| 31.5 | 255 | 0.743943 | yue_Hant | 0.979964 |
fa399c3eacffb4997701724bd03c325c5eb890a1 | 1,351 | md | Markdown | AlchemyInsights/enable-send-immediatly-when-connected.md | isabella232/OfficeDocs-AlchemyInsights-pr.sk-SK | 8fd17c4b16a3370a34bbb761d4edc9f8dfb2784a | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T19:07:47.000Z | 2021-04-21T00:13:49.000Z | AlchemyInsights/enable-send-immediatly-when-connected.md | MicrosoftDocs/OfficeDocs-AlchemyInsights-pr.sk-SK | a4bdea6e681ab468fe3d8f0c6ea9c8f4f1c0af6f | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-06-02T23:29:32.000Z | 2022-02-09T06:59:11.000Z | AlchemyInsights/enable-send-immediatly-when-connected.md | isabella232/OfficeDocs-AlchemyInsights-pr.sk-SK | 8fd17c4b16a3370a34bbb761d4edc9f8dfb2784a | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-09T20:33:51.000Z | 2021-10-09T10:38:46.000Z | ---
title: Enable Send immediately when connected
ms.author: pebaum
author: pebaum
manager: scotv
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "2713"
- "9000768"
- "9002385"
- "4645"
ms.openlocfilehash: 27b4caf4d1f3fcaa16031ee8d80dd01bda1cc1bc1511983632ebbabf82f8ecbc
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: sk-SK
ms.lasthandoff: 08/05/2021
ms.locfileid: "54117895"
---
# <a name="enable-send-immediately-when-connected"></a>Enable Send immediately when connected
1. On the **File** tab, click **Options**.
2. In the Outlook Options dialog box, click **Advanced**.
3. Under Send and receive, turn on **Send immediately when connected**. Click **OK**.
For full details, see:
- [Video: Send or delete an email stuck in your outbox](https://support.office.com/article/Video-Send-or-delete-an-email-stuck-in-your-outbox-26d5d34a-4e5f-444a-a9e8-44db04a94dec)
- [Email stays in the Outbox folder until you manually initiate a send-receive operation](https://support.microsoft.com/help/2797572/email-stays-in-the-outbox-folder-until-you-manually-initiate-a-send-re)
| 38.6 | 217 | 0.799408 | slk_Latn | 0.91859 |
fa3a9290e8a983ba82d02c5cc45516960c38302b | 435 | md | Markdown | README.md | attester/attester-junit | ade90657fb797a63f04344088641840dca5117eb | [
"Apache-2.0"
] | null | null | null | README.md | attester/attester-junit | ade90657fb797a63f04344088641840dca5117eb | [
"Apache-2.0"
] | 4 | 2020-03-04T21:41:44.000Z | 2021-12-09T19:49:16.000Z | README.md | attester/attester-junit | ade90657fb797a63f04344088641840dca5117eb | [
"Apache-2.0"
] | null | null | null | Attester - Junit
================
*attester-junit* is a Junit runner which allows Javascript tests to be run as if they were JUnit tests.
[attester](https://github.com/ariatemplates/attester) is used in the background to execute the tests and get the results.
This tool is useful to integrate attester with any Junit-compatible program (Eclipse, ...).
Usage
-----
*Documentation about this tool will be added in the coming weeks.*
| 33.461538 | 121 | 0.737931 | eng_Latn | 0.998494 |
fa3ae0a2efcc885f6ce65186289f478f2201ae2e | 1,720 | md | Markdown | includes/vpn-gateway-add-lng-rm-portal-include.md | ggailey777/azure-docs | 4520cf82cb3d15f97877ba445b0cfd346c81a034 | [
"CC-BY-3.0"
] | null | null | null | includes/vpn-gateway-add-lng-rm-portal-include.md | ggailey777/azure-docs | 4520cf82cb3d15f97877ba445b0cfd346c81a034 | [
"CC-BY-3.0"
] | null | null | null | includes/vpn-gateway-add-lng-rm-portal-include.md | ggailey777/azure-docs | 4520cf82cb3d15f97877ba445b0cfd346c81a034 | [
"CC-BY-3.0"
] | 1 | 2019-03-31T17:25:38.000Z | 2019-03-31T17:25:38.000Z | 1. In the portal, from **All resources**, click **+Add**. In the **Everything** blade search box, type **Local network gateway**, then click to search. This will return a list. Click **Local network gateway** to open the blade, then click **Create** to open the **Create local network gateway** blade.

2. On the **Create local network gateway blade**, specify a **Name** for your local network gateway object.
3. Specify a valid public **IP address** for the VPN device or virtual network gateway to which you want to connect.<br>If this local network represents an on-premises location, this is the public IP address of the VPN device that you want to connect to. It cannot be behind NAT and has to be reachable by Azure.<br>If this local network represents another VNet, you will specify the public IP address that was assigned to the virtual network gateway for that VNet.<br>
4. **Address Space** refers to the address ranges for the network that this local network represents. You can add multiple address space ranges. Make sure that the ranges you specify here do not overlap with ranges of other networks that you want to connect to.
5. For **Subscription**, verify that the correct subscription is showing.
6. For **Resource Group**, select the resource group that you want to use. You can either create a new resource group, or select one that you have already created.
7. For **Location**, select the location that this object will be created in. You may want to select the same location that your VNet resides in, but you are not required to do so.
8. Click **Create** to create the local network gateway.
| 143.333333 | 469 | 0.763372 | eng_Latn | 0.999347 |
fa3affa63cc7af6256528b4aea5a0e15a4dc1468 | 2,323 | md | Markdown | content/publication/pias-ton.md | li-ch/resume | 7c2f9a9fd9e50fc0ec84fc13c4c36682dc7e181c | [
"MIT"
] | null | null | null | content/publication/pias-ton.md | li-ch/resume | 7c2f9a9fd9e50fc0ec84fc13c4c36682dc7e181c | [
"MIT"
] | null | null | null | content/publication/pias-ton.md | li-ch/resume | 7c2f9a9fd9e50fc0ec84fc13c4c36682dc7e181c | [
"MIT"
] | null | null | null | +++
url_dataset = ""
abstract = "Many existing data center network DCN flow scheduling schemes, that minimize flow completion times FCT assume prior knowledge of flows and custom switch functions, making them superior in performance but hard to implement in practice. By contrast, we seek to minimize FCT with no prior knowledge and existing commodity switch hardware. To this end, we present PIAS, a DCN flow scheduling mechanism that aims to minimize FCT by mimicking shortest job first SJF on the premise that flow size is not known a priori. At its heart, PIAS leverages multiple priority queues available in existing commodity switches to implement a multiple level feedback queue, in which a PIAS flow is gradually demoted from higher-priority queues to lower-priority queues based on the number of bytes it has sent. As a result, short flows are likely to be finished in the first few high-priority queues and thus be prioritized over long flows in general, which enables PIAS to emulate SJF without knowing flow sizes beforehand. We have implemented a PIAS prototype and evaluated PIAS through both testbed experiments and ns-2 simulations. We show that PIAS is readily deployable with commodity switches and backward compatible with legacy TCP/IP stacks. Our evaluation results show that PIAS significantly outperforms existing information-agnostic schemes, for example, it reduces FCT by up to 50% compared to DCTCP and L2DCT; and it only has a 1.1% performance gap to an ideal information-aware scheme, pFabric, for short flows under a production DCN workload."
authors = [
"Wei Bai", "Li Chen", "Kai Chen", "Dongsu Han", "Chen Tian", "Hao Wang"
]
math = false
selected = false
url_code = ""
url_slides = ""
publication_short = "IEEE/ACM ToN"
title = "PIAS: Practical Information-Agnostic Flow Scheduling for Commodity Data Centers"
image = ""
image_preview = ""
url_project = ""
date = "2017-08-02T14:11:59+08:00"
abstract_short = ""
publication = "IEEE/ACM Transactions on Networking (TON), Volume 25 Issue 4, August 2017, Page 1954-1967"
url_video = ""
url_pdf = "https://dl.acm.org/citation.cfm?id=3148779"
# Publication type.
# Legend:
# 0 = Uncategorized
# 1 = Conference proceedings
# 2 = Journal
# 3 = Work in progress
# 4 = Technical report
# 5 = Book
# 6 = Book chapter
publication_types = ["2"]
+++
| 72.59375 | 1,551 | 0.775291 | eng_Latn | 0.987872 |
fa3b4a1aa91a5a9d9645dc09649206f5d3d0eacc | 5,137 | md | Markdown | _posts/2021-01-15-jpa-basic2.md | cotchan/deprecated | 9903ff810fed843a2f5108f2bcbf7a9acee9505b | [
"MIT"
] | null | null | null | _posts/2021-01-15-jpa-basic2.md | cotchan/deprecated | 9903ff810fed843a2f5108f2bcbf7a9acee9505b | [
"MIT"
] | null | null | null | _posts/2021-01-15-jpa-basic2.md | cotchan/deprecated | 9903ff810fed843a2f5108f2bcbf7a9acee9505b | [
"MIT"
] | null | null | null | ---
title: JPA) JPA Basic Concepts 2
author: cotchan
date: 2021-01-15 08:05:21 +0800
categories: [JPA, JPA_Basic]
tags: [jpa]
---
+ **This post was written for my personal study purposes**
---
## 1. Creating and Mapping an Object and a Table
+ **@Entity**
+ **@Id**
---
+ **`@Entity`**
+ Marks the class as an object that JPA will manage.
```java
import javax.persistence.Entity;
import javax.persistence.Id;
@Entity
//@Table(name = "USER") // if the DB table is named USER, queries will target the USER table
public class Member {
@Id
private Long id;
// @Column(name = "user_name") // when the actual DB column is user_name
private String name;
}
```
+ **`@Id`**
+ Maps to the table's primary key (PK) in the database.
---
+ The entity above maps to a table like the following.
```java
create table Member (
id bigint not null,
name varchar(255),
primary key(id)
);
```
---
## 2. EntityManagerFactory, EntityManager Sample Code
+ `JpaMain.java`
```java
package hellojpa;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.EntityTransaction;
import javax.persistence.Persistence;
import java.util.List;
public class JpaMain {
public static void main(String[] args) {
EntityManagerFactory emf = Persistence.createEntityManagerFactory("hello");
/**
* Think of this as being created exactly once per DB,
* at the moment the web server starts up.
*/
EntityManager em = emf.createEntityManager();
/**
* An entity manager is created for each client request and then thrown away.
* An entity manager must never be shared between threads.
*/
EntityTransaction tx = em.getTransaction();
/**
* code
* In JPA, every data change must be performed inside a transaction.
*/
// begin the transaction
tx.begin();
try {
// save
// Member member = new Member();
// member.setId(2L);
// member.setName("HelloB");
//
// em.persist(member); // save
// find
// Member findMember = em.find(Member.class, 1L);
// System.out.println("findMember.id = " + findMember.getId());
// System.out.println("findMember.name = " + findMember.getName());
// delete
// Member findMember = em.find(Member.class, 1L);
// em.remove(findMember);
// update
Member findMember = em.find(Member.class, 1L);
findMember.setName("HelloJPA");
/**
* hibernate.show_sql = shows the actual query in the log (via println)
* hibernate.format_sql = pretty-prints the query
* hibernate.use_sql_comments = shows comments such as "insert hellojpa.Member" with the query
*/
// JPA fires the query against the Member object (not the TABLE)
List<Member> result = em.createQuery("select m from Member as m", Member.class)
.getResultList();
for (Member member : result) {
System.out.println("member.name = " + member.getName());
}
tx.commit();
} catch (Exception e) {
tx.rollback();
} finally {
em.close();
}
emf.close();
}
}
```
---
+ `persistence.xml`
```xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.2"
xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_2.xsd">
<!-- persistence unit (database) name -->
<persistence-unit name="hello">
<properties>
<!-- required properties -->
<property name="javax.persistence.jdbc.driver" value="org.h2.Driver"/>
<property name="javax.persistence.jdbc.user" value="sa"/>
<property name="javax.persistence.jdbc.password" value=""/>
<property name="javax.persistence.jdbc.url" value="jdbc:h2:tcp://localhost/~/test"/>
<property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
<!-- options -->
<property name="hibernate.show_sql" value="true"/>
<property name="hibernate.format_sql" value="true"/>
<property name="hibernate.use_sql_comments" value="true"/>
<!--<property name="hibernate.hbm2ddl.auto" value="create" />-->
</properties>
</persistence-unit>
</persistence>
```
---
## 3. Cautions
1. Create only one `EntityManagerFactory` and share it across the entire application.
+ Think of it as being created exactly once per DB when the web server starts up.
2. An `EntityManager` must never be shared between threads.
+ An entity manager is created for each client request and then discarded.
3. **In JPA, `every data change` is executed `inside a transaction`.**
---
## 4. What Is JPQL
+ With JPA, development is `centered on entity objects`.
+ The problem is search queries.
+ Even when searching, you search against entity objects, not tables.
+ It is impossible to convert all DB data into objects and then search them.
+ So to load only the data the application needs, you ultimately need SQL that includes search conditions.
---
+ **JPA provides JPQL, an object-oriented query language that abstracts SQL.**
+ Its syntax is similar to SQL: it supports SELECT, FROM, WHERE, GROUP BY, HAVING, and JOIN.
+ **`JPQL queries entity objects`**
+ **`SQL queries database tables`**
---
+ An object-oriented query that **targets objects, not tables**
+ Because it abstracts SQL, it does not depend on any specific database's SQL.
+ In a word, JPQL is object-oriented SQL.
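+ For example (a minimal sketch; `name` is the entity field mapped earlier in this post):

```java
List<Member> members = em.createQuery(
        "select m from Member m where m.name like :name", Member.class)
    .setParameter("name", "Hello%")
    .getResultList();
```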
---
+ Source
+ [자바 ORM 표준 JPA 프로그래밍 - 기본편](https://www.inflearn.com/course/ORM-JPA-Basic)
| 23.672811 | 134 | 0.579716 | kor_Hang | 0.996745 |
fa3bed2c9bf0c6ed24617c6bc1210aebbbc91401 | 689 | md | Markdown | docs/_rules/xUnit1002.md | sharwell/xunit.analyzers | 00fec62fcd1219e1f6e4510560faacc2fe00feee | [
"Apache-2.0"
] | null | null | null | docs/_rules/xUnit1002.md | sharwell/xunit.analyzers | 00fec62fcd1219e1f6e4510560faacc2fe00feee | [
"Apache-2.0"
] | null | null | null | docs/_rules/xUnit1002.md | sharwell/xunit.analyzers | 00fec62fcd1219e1f6e4510560faacc2fe00feee | [
"Apache-2.0"
] | null | null | null | ---
title: xUnit1002
description: Test methods cannot have multiple Fact or Theory attributes
category: Usage
severity: Error
---
## Cause
A test method has multiple Fact or Theory attributes.
## Reason for rule
A test method only needs one Fact or Theory attribute.
## How to fix violations
To fix a violation of this rule, remove all but one of the Fact or Theory attributes.
## Examples
### Violates
```csharp
public class TestClass
{
[Fact, Theory]
public void TestMethod()
{
}
}
```
### Does not violate
```csharp
public class TestClass
{
[Fact]
public void TestMethod()
{
}
}
```
```csharp
public class TestClass
{
[Theory]
public void TestMethod()
{
}
}
```
| 12.527273 | 85 | 0.698113 | eng_Latn | 0.926596 |
fa3bf3278a0934253d99017831dc8d2f29496bfe | 4,705 | md | Markdown | docs/windows/creating-transparent-or-inverse-regions-in-device-images.md | ANKerD/cpp-docs.pt-br | 6910dc17c79db2fee3f3616206806c5f466b3f00 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/windows/creating-transparent-or-inverse-regions-in-device-images.md | ANKerD/cpp-docs.pt-br | 6910dc17c79db2fee3f3616206806c5f466b3f00 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/windows/creating-transparent-or-inverse-regions-in-device-images.md | ANKerD/cpp-docs.pt-br | 6910dc17c79db2fee3f3616206806c5f466b3f00 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Creating Transparent or Inverse Regions in Device Images (Image Editor for Icons) | Microsoft Docs
ms.custom: ''
ms.date: 11/04/2016
ms.technology:
- cpp-windows
ms.topic: conceptual
dev_langs:
- C++
helpviewer_keywords:
- cursors [C++], screen regions
- inverse colors [C++], device images
- transparent regions, device images
- transparency, device images
- Image editor [C++], device images
- inverse regions, device images
- cursors [C++], transparent regions
- screen colors
- regions, transparent
- icons [C++], transparent regions
- display devices [C++], transparent and screen regions
- transparent regions in devices
- regions, inverse
- colors [C++], Image editor
- device projects [C++], transparent images
- icons [C++], screen regions
ms.assetid: a994954b-b039-4391-a535-58d1fa10fc3b
author: mikeblome
ms.author: mblome
ms.workload:
- cplusplus
- uwp
ms.openlocfilehash: 2789d5c70dfe4569ef8eaa1d17136ad037efd18a
ms.sourcegitcommit: 799f9b976623a375203ad8b2ad5147bd6a2212f0
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 09/19/2018
ms.locfileid: "46374549"
---
# <a name="creating-transparent-or-inverse-regions-in-device-images-image-editor-for-icons"></a>Creating transparent or inverse regions in device images (Image Editor for icons)
In the [Image Editor](../windows/image-editor-for-icons.md), the initial icon or cursor image has a transparent attribute. Although icon and cursor images are rectangular, many do not appear so because parts of the image are transparent and the underlying screen image shows through the icon or cursor. When you drag an icon, parts of the image may appear in an inverted color. You create this effect by setting the screen color and the inverse color in the [Colors window](../windows/colors-window-image-editor-for-icons.md).
The screen and inverse colors shape the icon or cursor image, deriving the image from color or designating inverse regions. The colors indicate the parts of the image that have those attributes. You can change the colors that represent the screen-color and inverse-color attributes while editing. These changes do not affect the appearance of the icon or cursor in your application.
> [!NOTE]
> The dialog boxes and menu commands you see might differ from those described in **Help**, depending on your edition or active settings. To change your settings, choose **Import and Export Settings** on the **Tools** menu. For more information, see [Personalize the Visual Studio IDE](/visualstudio/ide/personalizing-the-visual-studio-ide).
### <a name="to-create-transparent-or-inverse-regions"></a>To create transparent or inverse regions
1. In the **Colors** window, select the **Screen Color** selector or the **Inverse Color** selector.
2. Apply the screen or inverse color to your image using a drawing tool. For more information about drawing tools, see [Using a drawing tool](using-a-drawing-tool-image-editor-for-icons.md).
### <a name="to-change-the-screen-or-inverse-color"></a>To change the screen or inverse color
1. Select the **Screen Color** selector or the **Inverse Color** selector.
2. Choose a color from the **Colors** palette in the **Colors** window.
The complementary color is automatically designated for the other selector.
> [!TIP]
> If you double-click the **Screen Color** or **Inverse Color** selector, the [Custom Color Selector dialog box](../windows/custom-color-selector-dialog-box-image-editor-for-icons.md) appears.
For information about adding resources to managed projects, see [Resources in desktop apps](/dotnet/framework/resources/index) in the *.NET Framework Developer's Guide*. For information about manually adding resource files to managed projects, accessing resources, displaying static resources, and assigning resource strings to properties, see [Creating resource files for desktop apps](/dotnet/framework/resources/creating-resource-files-for-desktop-apps). For information about globalizing and localizing resources in managed apps, see [Globalizing and localizing .NET Framework applications](/dotnet/standard/globalization-localization/index).
## <a name="requirements"></a>Requirements
None
## <a name="see-also"></a>See also
[Accelerator keys](../windows/accelerator-keys-image-editor-for-icons.md)<br/>
[Icons and cursors: image resources for display devices](../windows/icons-and-cursors-image-resources-for-display-devices-image-editor-for-icons.md)
fa3cfdf822e0b0c2f739e96a3fc0057e0a1e43d5 | 1,338 | md | Markdown | CONTRIBUTING.md | dylmeadows/lambdadepot | bccfce4199c7f8c4c8e39a729519609f5ee5fd4a | [
"Apache-2.0"
] | null | null | null | CONTRIBUTING.md | dylmeadows/lambdadepot | bccfce4199c7f8c4c8e39a729519609f5ee5fd4a | [
"Apache-2.0"
] | 1 | 2020-11-06T20:58:05.000Z | 2020-11-06T20:58:05.000Z | CONTRIBUTING.md | tom9carthron1/lambdadepot | 9c5217432cd78356093046071117c255ef5cdfac | [
"Apache-2.0"
] | 2 | 2020-11-10T05:17:49.000Z | 2021-01-13T16:46:56.000Z | # How to Contribute to lambdadepot
## Did you find a bug?
We use [GitHub issues](https://github.com/homedepot/lambdadepot/issues) to track bugs and enhancements.
If you are reporting a bug, please help to speed up problem diagnosis by providing as much information as possible. Ideally, that would include a small sample project that reproduces the problem.
## Did you write a patch that fixes a bug?
Open a new [GitHub pull request](https://github.com/homedepot/lambdadepot/pulls) with the patch. Ensure the PR description clearly describes the problem and solution.
### Did you fix whitespace, format code, or make a purely cosmetic patch?
Changes that are cosmetic in nature and do not add anything substantial to the stability, functionality, or testability of _**lambdadepot**_ will generally not be accepted.
### Pull Request Checklist
1. Run the `test` gradle task
* Verify no unit tests are failing
2. Run the `checkstyleMain` and `checkstyleTest` gradle tasks.
* Verify there are no checkstyle errors for `main` and `test` source sets
* Reports will be generated under the `build/reports/checkstyle` directory.
3. Run the `pitest` gradle task.
* Verify mutation test coverage is equal to or greater than existing mutation threshold
* A report will be generated the under `build/reports/pitest` directory. | 38.228571 | 195 | 0.772048 | eng_Latn | 0.997338 |
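Assuming the project ships the standard Gradle wrapper, the whole checklist can be run in one command:

```
./gradlew test checkstyleMain checkstyleTest pitest
```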
fa3d492d827bc51cc70667ec72956c09a46addbb | 8,497 | md | Markdown | docs/parsers/dig.md | lyterk/jc | 681176e4c958157ef1f2151b3e57963a6ba48e09 | [
"MIT"
] | null | null | null | docs/parsers/dig.md | lyterk/jc | 681176e4c958157ef1f2151b3e57963a6ba48e09 | [
"MIT"
] | null | null | null | docs/parsers/dig.md | lyterk/jc | 681176e4c958157ef1f2151b3e57963a6ba48e09 | [
"MIT"
] | null | null | null | [Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.dig"></a>
# jc.parsers.dig
jc - JSON CLI output utility `dig` command output parser
Options supported:
- `+noall +answer` options are supported in cases where only the answer
information is desired.
- `+axfr` option is supported on its own
The `when_epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
The `when_epoch_utc` calculated timestamp field is timezone-aware and is
only available if the timezone field is UTC.
Usage (cli):
$ dig example.com | jc --dig
or
$ jc dig example.com
Usage (module):
import jc
result = jc.parse('dig', dig_command_output)
or
import jc.parsers.dig
result = jc.parsers.dig.parse(dig_command_output)
Schema:
[
{
"id": integer,
"opcode": string,
"status": string,
"flags": [
string
],
"query_num": integer,
"answer_num": integer,
"authority_num": integer,
"additional_num": integer,
"axfr": [
{
"name": string,
"class": string,
"type": string,
"ttl": integer,
"data": string
}
],
"opt_pseudosection": {
"edns": {
"version": integer,
"flags": [
string
],
"udp": integer
},
"cookie": string
},
"question": {
"name": string,
"class": string,
"type": string
},
"answer": [
{
"name": string,
"class": string,
"type": string,
"ttl": integer,
"data": string
}
],
"additional": [
{
"name": string,
"class": string,
"type": string,
"ttl": integer,
"data": string
}
],
"authority": [
{
"name": string,
"class": string,
"type": string,
"ttl": integer,
"data": string
}
],
"query_size": integer,
"query_time": integer, # in msec
"server": string,
"when": string,
"when_epoch": integer, # [0]
"when_epoch_utc": integer, # [1]
"rcvd": integer
"size": string
}
]
[0] naive timestamp if "when" field is parsable, else null
[1] timezone-aware timestamp available for UTC, else null
Examples:
$ dig example.com | jc --dig -p
[
{
"id": 2951,
"opcode": "QUERY",
"status": "NOERROR",
"flags": [
"qr",
"rd",
"ra"
],
"query_num": 1,
"answer_num": 1,
"authority_num": 0,
"additional_num": 1,
"opt_pseudosection": {
"edns": {
"version": 0,
"flags": [],
"udp": 4096
}
},
"question": {
"name": "example.com.",
"class": "IN",
"type": "A"
},
"answer": [
{
"name": "example.com.",
"class": "IN",
"type": "A",
"ttl": 39302,
"data": "93.184.216.34"
}
],
"query_time": 49,
"server": "2600:1700:bab0:d40::1#53(2600:1700:bab0:d40::1)",
"when": "Fri Apr 16 16:05:10 PDT 2021",
"rcvd": 56,
"when_epoch": 1618614310,
"when_epoch_utc": null
}
]
$ dig cnn.com www.cnn.com @205.251.194.64 | jc --dig -p -r
[
{
"id": "46052",
"opcode": "QUERY",
"status": "NOERROR",
"flags": [
"qr",
"rd",
"ra"
],
"query_num": "1",
"answer_num": "1",
"authority_num": "0",
"additional_num": "1",
"opt_pseudosection": {
"edns": {
"version": "0",
"flags": [],
"udp": "4096"
}
},
"question": {
"name": "example.com.",
"class": "IN",
"type": "A"
},
"answer": [
{
"name": "example.com.",
"class": "IN",
"type": "A",
"ttl": "40426",
"data": "93.184.216.34"
}
],
"query_time": "48 msec",
"server": "2600:1700:bab0:d40::1#53(2600:1700:bab0:d40::1)",
"when": "Fri Apr 16 16:06:12 PDT 2021",
"rcvd": "56"
}
]
$ dig -x 1.1.1.1 | jc --dig -p
[
{
"id": 20785,
"opcode": "QUERY",
"status": "NOERROR",
"flags": [
"qr",
"rd",
"ra"
],
"query_num": 1,
"answer_num": 1,
"authority_num": 0,
"additional_num": 1,
"opt_pseudosection": {
"edns": {
"version": 0,
"flags": [],
"udp": 4096
}
},
"question": {
"name": "1.1.1.1.in-addr.arpa.",
"class": "IN",
"type": "PTR"
},
"answer": [
{
"name": "1.1.1.1.in-addr.arpa.",
"class": "IN",
"type": "PTR",
"ttl": 1800,
"data": "one.one.one.one."
}
],
"query_time": 40,
"server": "2600:1700:bab0:d40::1#53(2600:1700:bab0:d40::1)",
"when": "Sat Apr 17 14:50:50 PDT 2021",
"rcvd": 78,
"when_epoch": 1618696250,
"when_epoch_utc": null
}
]
$ dig -x 1.1.1.1 | jc --dig -p -r
[
{
"id": "32644",
"opcode": "QUERY",
"status": "NOERROR",
"flags": [
"qr",
"rd",
"ra"
],
"query_num": "1",
"answer_num": "1",
"authority_num": "0",
"additional_num": "1",
"opt_pseudosection": {
"edns": {
"version": "0",
"flags": [],
"udp": "4096"
}
},
"question": {
"name": "1.1.1.1.in-addr.arpa.",
"class": "IN",
"type": "PTR"
},
"answer": [
{
"name": "1.1.1.1.in-addr.arpa.",
"class": "IN",
"type": "PTR",
"ttl": "1800",
"data": "one.one.one.one."
}
],
"query_time": "52 msec",
"server": "2600:1700:bab0:d40::1#53(2600:1700:bab0:d40::1)",
"when": "Sat Apr 17 14:51:46 PDT 2021",
"rcvd": "78"
}
]
$ dig +noall +answer cnn.com | jc --dig -p
[
{
"answer": [
{
"name": "cnn.com.",
"class": "IN",
"type": "A",
"ttl": 60,
"data": "151.101.193.67"
},
{
"name": "cnn.com.",
"class": "IN",
"type": "A",
"ttl": 60,
"data": "151.101.65.67"
},
{
"name": "cnn.com.",
"class": "IN",
"type": "A",
"ttl": 60,
"data": "151.101.1.67"
},
{
"name": "cnn.com.",
"class": "IN",
"type": "A",
"ttl": 60,
"data": "151.101.129.67"
}
]
}
]
<a id="jc.parsers.dig.parse"></a>
### parse
```python
def parse(data, raw=False, quiet=False)
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, aix, freebsd, darwin, win32, cygwin
Version 2.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
| 24.002825 | 72 | 0.378251 | eng_Latn | 0.273055 |
fa3d8e2d5b69dd99815fc475a8a7c328f388afdc | 36 | md | Markdown | README.md | ashwaninath999/Twitter-analysis | 4cff75ddeb2101d77c69ebfeae7cc67e004a2327 | [
"Apache-2.0"
] | null | null | null | README.md | ashwaninath999/Twitter-analysis | 4cff75ddeb2101d77c69ebfeae7cc67e004a2327 | [
"Apache-2.0"
] | null | null | null | README.md | ashwaninath999/Twitter-analysis | 4cff75ddeb2101d77c69ebfeae7cc67e004a2327 | [
"Apache-2.0"
] | null | null | null | # Twitter-analysis
Twitter comments
| 12 | 18 | 0.833333 | fra_Latn | 0.400201 |
fa3f17bcfd3b87a1cd8107ecf01a7e70a9d14c62 | 43,711 | md | Markdown | CHANGELOG.md | craigw/gds-api-adapters | 8942adafb259210bdf39e0d1b4e141bf7d6e3727 | [
"MIT"
] | null | null | null | CHANGELOG.md | craigw/gds-api-adapters | 8942adafb259210bdf39e0d1b4e141bf7d6e3727 | [
"MIT"
] | null | null | null | CHANGELOG.md | craigw/gds-api-adapters | 8942adafb259210bdf39e0d1b4e141bf7d6e3727 | [
"MIT"
] | null | null | null | ## 67.0.1
* URI encode parameters in Email Alert API methods
* Remove dependency on rack-cache gem
# 67.0.0
* Note: this release was misnumbered and should have been version 64 - there
are no versions 64, 65, 66 of this gem.
* BREAKING: Remove deprecated test helper methods that weren't prefixed with `stub_`
* BREAKING: Remove `GdsApi::PublishingApiV2` use `GdsApi::PublishingApi`
# 63.6.0
* Remove request body parameters from error reporting payload.
# 63.5.1
* Update Worldwide test helpers to use the website root
# 63.5.0
* Change Worldwide API requests to use the website root rather than whitehall-frontend
# 63.4.0
* Add `stub_any_publishing_api_unreserve_path` test helper.
* Fix Publishing API `stub_any_publishing_api_call` methods only operating on
the V2 endpoint.
# 63.3.0
* Change Worldwide API requests to be routed to whitehall-frontend by
default, rather than whitehall-admin. Update the test helpers
accordingly.
# 63.2.0
* Issue a warning when deprecated stub methods are called. `stub_*` methods
should be used instead.
* Remove methods incorrectly marked as being stubs; I don't expect any are used
  externally, so this shouldn't be a breaking change.
* Add `get_live_content` method to `GdsApi::PublishingApi`
# 63.1.1
* Fix `GdsApi::TestHelpers::PublishingApiV2` not requiring Publishing API test helpers.
# 63.1.0
* Make success and error responses on Publishing API path methods accurate.
* Deprecate `stub_default_*` Publishing API test helpers in favour of the `stub_any_*` convention.
* Add `stub_publishing_api_path_reservation` test helper method.
* Combine `GdsApi::PublishingAPI` and `GdsApi::PublishingApiV2`
# 63.0.0
* BREAKING: Remove `GdsApi::Rummager`. Use `GdsApi::Search` instead.
# 62.0.0
* BREAKING: rename Email Alert API adapter method to create auth token
* Add Email Alert API adapter method to send subscription verification email
# 61.1.0
* Add stubs for email-alert-api edge cases (auth token and subscriber update)
* Extend the email-alert-api subscription stub to support an array of subscriptions
# 61.0.0
* Removes publishing pacts for released versions as they incorrectly referenced
unreleased changes
* Removes `Search#advanced_search`
# 60.1.0
* Update get_subscriptions endpoint for email-alert-api to accept order param
# 60.0.0
* Renames the Email Alert API `send_alert` method to `create_content_change` (and
related test helper methods) to reflect a change in the underling endpoint.
**Note:** this is a breaking change for users of the Email Alert API
adapters.
* Adds `create_email` Email Alert API method to send individual emails.
* Adds `create_message` Email Alert API method to send message emails out to a
subscriber list.
# 59.6.0
* Adds content_id to worldwide location test helper
* Adds middleware to allow the header `X-Govuk-Authenticated-User-Organisation`
to be passed along to content-store.
* Changes logging level about header forwarding from info to debug
# 59.5.1
* Adds `combine_mode` parameter to Email Alert API test helpers
# 59.5.0
* Adds `combine_mode` parameter to Email Alert API `find_subscriber_list` method
# 59.4.0
* Adds `stub_email_alert_api_has_subscriptions` test helper method.
* Ensures that `stub_email_alert_api_has_subscription` also stubs the `/latest` endpoint.
# 59.3.0
* Add `get_latest_matching_subscription` method to `GdsApi::EmailAlertApi`.
# 59.2.1
* Warn when `GdsApi::Rummager` is initialised and `GdsApi::TestHelpers::Rummager`
  is included.
# 59.2.0
* Add `GdsApi::Search`, deprecate `GdsApi::Rummager`.
* Add `GdsApI::TestHelpers::Search`, deprecate `GdsApi::TestHelpers::Rummager`.
* Make `GdsApi.search` return a `GdsApi::Search`, deprecate `GdsApi.rummager`.
# 59.1.0
* Use `POST` rather than `GET` to perform anonymous feedback queries with
support-api.
# 59.0.0
* Expect applications to access the bank holidays API via the public website
and not internally via the calendars app.
# 58.0.0
* Rename references of subscribable to subscriber_list in Email Alert API. Note
  this is a breaking change for users of the Email Alert API adapters; clients
  will also need to update any references to subscribable.
# 57.5.0
* Support new unreserve_path endpoint for Publishing API (v1) adapter
* Remove reject_content_purpose_supergroup as optional email alert api subscriber list param
# 57.4.2
* Find rummager by using the "search" alias, rather than referring to "rummager" directly
# 57.4.1
* Add reject_content_purpose_supergroup as optional email alert api subscriber list param
# 57.4.0
* Add GdsApi::PublishingApiV2#republish endpoint to `publishing_api`
# 57.3.1
* Rename `create_business_finder_feedback` to `create_content_improvement_feedback`
# 57.3.0
* Add Router API stubs for getting routes.
# 57.2.4
* Add `content_purpose_supergroup` as optional parameter to `find_subscriber_list` in `GdsApi::EmailAlertApi`
* Add automatic bearer tokens for `GdsApi.router` and `GdsApi.content_store`.
# 57.2.3
* Add `create_business_finder_feedback` to `GdsApi::SupportApi`
* Add `stub_support_api_create_business_finder_feedback` to `GdsApi::TestHelpers::SupportApi`
# 57.2.2
* Fix more deprecated helpers (Asset Manager and Link Checker API)
* Fix stub_asset_manager_receives_an_asset not returning unique filenames each call
* Rename stub_asset_manager_is_down to stub_asset_manager_isnt_available
# 57.2.1
* Fix broken test helper for Publishing API
# 57.2.0
* Change all test helpers to use a stub\_ prefix (old names are now aliases)
# 57.1.0
* Add additional Asset Manager test helper methods
# 57.0.0
* Pass `LINK_CHECKER_API_BEARER_TOKEN` in Link Checker API requests if present.
* Pass `SUPPORT_API_BEARER_TOKEN` in Support API requests if present.
# 56.0.0
* Change the expected Publishing API behaviour regarding percent-encoding of URLs included in responses.
* Remove many deprecated methods
- `GdsApi::TestHelpers::Rummager.stub_any_rummager_post_with_queueing_enabled`
- Use `stub_any_rummager_post` instead
- `GdsApi::Rummager.delete_content!` and `GdsApi::Rummager.get_content!`
- Use `delete_content` and `get_content` respectively
- `GdsApi::PublishingApiV2.get_content!`
- Use `get_content` instead
- `GdsApi::PublishingApi.put_path`
- Use `GdsApi::PublishingApiV2.put_path` instead
- `GdsApi::ContentStore.get_content!`
- Use `get_content` instead
* Remove the `type` parameter from `GdsApi::Router.get_route`
* Remove the `GdsApi::Helpers` module
- Use the `GdsApi` module methods instead
# 55.0.2
* Add SocketError exception handling.
# 55.0.1
* Change how the default logger is assigned.
# 55.0.0
* Ensure new Publishing API patch_links stub is symbol/string-agnostic
* Change GdsApi.organisations adapter to use the public organisations API
* Fix the URL for Rummager batch queries.
# 54.1.3
* Extend Publishing API V2 unavailable stub to cover legacy V1 routes
* Extend Asset Manager not found stub to cover delete action
# 54.1.1
* Add extra Publishing API stub for conflict when patching links
# 54.1.0
* Add ability to batch search Rummager
# 54.0.0
* Expect `GdsApi::TestHelpers::Organisations` to be using the public API instead of Whitehall.
# 53.2.0
* Add methods to GdsApi to create instances of adapters with common options to reduce boilerplate code across apps
* Deprecate GdsApi::Helpers in favour of using explicit GdsApi.service_name methods
# 53.1.0
* Add Asset Manager test helpers: `asset_manager_update_asset`, `asset_manager_update_failure`, `asset_manager_delete_asset` and `asset_manager_delete_asset_failure`.
# 53.0.0
* Remove support for caching responses.
# 52.8.0
* Add support for the `unpublish-messages` endpoint in email-alert-api
# 52.7.0
* Expose the `country_name` parameter as part of the Mapit test helper
* Add `put_path` method to PublishingApiV2
# 52.6.0
* Add `generate` argument to `publishing_api_has_expanded_links` which reflects the behaviour of the actual request
more closely
# 52.5.1
* Make the subscription response for Email Alert API closer to reality.
# 52.5.0
* Add `create_auth_token` to `GdsApi::EmailAlertApi`.
# 52.4.0
* Add `unsubscribe_subscriber` to `GdsApi::EmailAlertApi`.
# 52.3.0
* Change `get_subscriptions` and `change_subscriber` to accept a subscriber ID rather than an email address
# 52.2.1
* Add a title parameter to the `get_subscription_response` stub for Email Alert API.
# 52.2.0
* Add `get_subscription`, `get_subscriptions`, `change_subscriber` and `change_subscription` to `GdsApi::EmailAlertApi`.
# 52.1.0
* Add `GdsApi::HTTPIntermittentServerError` and `GdsApi::HTTPIntermittentClientError` superclasses.
* Add a `GdsApi::HTTPTooManyRequests` exception
# 52.0.0
* Remove deprecated `notifications` and `notification` methods from `GdsApi::EmailAlertApi`.
# 51.4.0
* Add support for the /feedback-by-day endpoint in the Support API
# 51.3.0
* Add `locale` param to `get_expanded_links` in the Publishing API
client
# 51.2.0
* Add helper method `.redirect_for_path` to GdsApi::ContentStore to allow determining a redirect destination from a request
# 51.1.1
* Add frequency param to `assert_subscribed` in Email alert api test helpers
# 51.1.0
* Include frequency as a parameter when subscribing to emails through Email alert api
# 51.0.0
* **Breaking Change:** Require a minimum of Ruby 2.3
* Change name of `DISABLE_JSON_API_CACHE` environment variable to `GDS_API_DISABLE_CACHE`
# 50.9.1
* Percent encode URLs when requesting Whitehall assets from Asset Manager API
# 50.9.0
* Add Environment variable which can disable caching of JSON API requests, `DISABLE_JSON_API_CACHE`
# 50.8.0
* Add V2 api endpoints for GdsAPI::Rummager#delete_document
* Add V2 api endpoints for GdsAPI::Rummager#insert_document
# 50.7.0
* Add GdsApi::SupportApi#document_type_list to retrieve list of formats for content items
* Add GdsApi::SupportApi#document_type_summary to retrieve feedback associated with content items of a certain format.
# 50.6.0
* Add `with_drafts` optional parameter to GdsApi::PublishingApiV2#lookup_content_ids and GdsApi::PublishingApiV2#lookup_content_id
# 50.5.0
* Add #email_alert_api_refuses_to_create_subscription test helper for
email-alert-api to simulate an error condition when trying to create
a subscription.
# 50.4.0
* Add GdsApi::EmailAlertApi#get_subscribable to retrieve subscribable
(currently SubscriberList in the api) by `gov_delivery_id`
(called `reference:` here as we will be renaming it in the API)
# 50.3.0
* Add GdsApi::PublishingApiV2#get_content_items_enum to enumerate content items
# 50.2.0
* Add GdsApi::EmailAlertApi#subscribe to allow users to subscribe to emails
# 50.1.0
* Add GdsApi::EmailAlertApi#unsubscribe to allow users to unsubscribe from emails.
# 50.0.0
* Remove GdsApi::NeedApi
* Change GdsApi::Router.delete\_route to take an optional hard\_delete
argument, removing support for the deprecated type argument.
# 49.8.0
* Add GdsApi::Rummager#search_enum method to expose search results as an enumerator.
# 49.7.0
* Add GdsApi::LinkCheckerApi#upsert_resource_monitor method for creating/updating a collection of monitored links for an application.
# 49.6.0
* Add GdsApi::AssetManager#whitehall_asset method for retrieving Whitehall assets from Asset Manager.
# 49.5.0
* Allow rummager search to pass additional headers
# 49.4.0
* Document new optional `legacy_etag` & `legacy_last_modified` attributes that
can be passed into `GdsApi::AssetManager#create_whitehall_asset` within the
`asset` Hash (#760)
# 49.3.1
* Avoid the following warning: Overriding "Content-Type" header "application/json" with "multipart/form-data; boundary=----RubyFormBoundaryX7Na6WDQqG3kLfD7" due to payload
# 49.3.0
* Use 1.5.0 minimum of [govuk-content-schema-test-helpers](https://github.com/alphagov/govuk-content-schema-test-helpers)
* Remove gem_publisher dependency since rake task is no longer used to publish gem
# 49.2.0
* Add GdsApi::AssetManager#create_whitehall_asset method (#752)
# 49.1.0
* Remove trailing slash in call to get_link_changes
# 49.0.0
* Remove `GdsApi::GovUkDelivery` and helpers as `govuk-delivery` has been
retired in favour of `email-alert-api`
* Add get_link_changes endpoint for publishing-api
# 48.0.0
* Resurrect `feedback_url` for Support (removed in 46.0.0)
* Remove need-api helper
Need API is being retired
# 47.9.1
* Group Sentry errors by exception type
# 47.9.0
* Add the HTTPPayloadTooLarge exception
* Add `get_links_for_content_ids` endpoint to Publishing API
* Add more specific exceptions for HTTPInternalServerError (500), HTTPBadGateway (502), HTTPUnavailable (503), HTTPGatewayTimeout (504) exceptions.
# 47.8.0
* Add `email_alert_api.topic_matches`
# 47.7.0
* Separate `find_or_create_subscriber_list` so that individual `find` or
`create` methods can be called in email-alert-api.
# 47.6.0
* Add `generate` option for Publishing API expanded links endpoint
# 47.5.0
* Add `enable_list` and `disable_list` endpoints for govuk-delivery.
# 47.4.0
* Add a `get_paged_editions` endpoint, which returns a lazy enumerator that pages
through results from the editions endpoint.
# 47.3.0
* Update `publishing-api` class to support the new get editions endpoint.
# 47.2.1
* Send the update_type of special routes on put content rather than publish.
# 47.2.0
* Make passing the `update_type` to the Publishing API on a publish optional.
# 47.1.3
* Fix Publishing API lookup_content_id and lookup_content_ids to send
exclude_unpublishing_types rather than exclude_publishing_types.
# 47.1.2
* Pass exclude_document_types and exclude_publishing_types fields to the Publishing API
when calling `lookup_content_id`
# 47.1.1
* Fixes url used for fetching search metrics from `backdrop read API`
# 47.1.0
* Update `link-checker-api` class to support new message format.
# 47.0.0
* Remove support for `content-api`
* Add `HTTPUnprocessableEntity` exceptions
* Introduce 4 new endpoints for `backdrop read API` to be used by `info-frontend`.
# 46.0.0
* Drop dead endpoints from support adapter
* Rename test helpers for support-api to make it clearer they stub requests to support-api
* Delete methods no longer needed by `licence-finder`: `licences_for_ids`,
`content_api_licence_hash`, `setup_content_api_licences_stubs` and
`content_api_has_licence`.
# 45.0.0
* Add api adapter for the bank-holidays json provided by calendars
# 44.0.0
* Revert changes made to `update_type` in the Publishing API pact tests in release 43.0.0
* Remove support for the `business-support-api`
# 43.0.0
* Set the `update_type` to major in all of the Publishing API pact tests.
* Remove support for `business_support_schemes`
* Support custom schemas in the SpecialRoutePublisher
# 42.0.0
* Make `lgil` mandatory when requesting links from Local Links Manager
# 41.5.0
* Add missing `link-checker-api` test helpers from the previous release.
# 41.4.0
* Add support for the secure webhooks of the `link-checker-api`.
# 41.3.0
* Add support for the `link-checker-api`.
# 41.2.0
* Update webmock gem dependency
* Add new `locale` and `details` fields to special routes
# 41.1.0
* Add new fields to 'find_or_create_subscriber_list' to support whitehall migration
- email_document_supertype
- government_document_supertype
- gov_delivery_id
* Port all jenkins.sh steps to Jenkinsfile
# 41.0.0
* Rename GOVUK_FACT_CHECK_ID header to GOVUK_AUTH_BYPASS_ID header
# 40.5.0
* Add support to request expanded links from publishing api with or without drafts
- The default is false, for backward compatibility
- https://github.com/alphagov/gds-api-adapters/pull/676
- https://github.com/alphagov/publishing-api/blob/master/doc/api.md#query-string-parameters-2
# 40.4.0
* Add support for a customized `document_type` when publishing a special route,
but keep the default `document_type` of `special_route`.
# 40.3.0
* Allow headers to be passed into `EmailAlertApi.send_alert`. This change is
needed to pass the `govuk_request_id` when email alert service processes
messages off rabbit mq.
* Comment out pact broker jenkins tasks as the service is currently offline. This
change will be reverted once pact broker is working again.
# 40.2.0
* Add support for passing logging parameters through to Gov Uk Delivery.
# 40.1.0
* Add a redirects option to the Unpublish adapter in Publishing API
# 40.0.0
* Remove Panopticon API
* Include the files within the test/fixtures directory, this fixes
some test helpers that would have been broken since 39.0.0.
# 39.2.0
* Add support for the import endpoint for the Publishing API.
* The `publishing_api_has_item` test helper can now take a hash of
params to match the request against.
* Remove Rails specific features from the implementation of some
Publishing API test helpers.
# 39.1.0
* Pass through GOVUK_FACT_CHECK_ID header. This will be added by
authenticating-proxy when a draft item is requested with a valid JWT token;
the value itself will be checked by content-store against the value stored
in the content item.
# 39.0.0
* Remove the `need_api_has_organisations` test helper.
# 38.1.0
* Handle URI::InvalidURIError exceptions with GdsApi::InvalidUrl
* Removed tests for the deprecated `format` field in the Publishing API.
# 38.0.0
* Added an adapter for the import endpoint of the Need API.
* Removed `GdsApi::Publisher`
* Removed `GdsApi::ExternalLinkTracker`
* Removed deprecated `PublishingApi` endpoints (`put_content_item` and
`put_draft_content_item`).
* Removed `ContentStore#incoming_links!`
* Removed `ContentStore#content_item!`
* Removed `PublishingApiV2#get_content!`
* Renamed `Rummager#delete_content!` to `Rummager#delete_content`
* Renamed `Rummager#get_content!` to `Rummager#get_content`
# 37.5.1
* Allow #dig on Response.
# 37.5.0
* Support `unpublished_at` field for `PublishingApiV2#unpublish`
# 37.4.0
* Add test helper methods to stub any unpublish or discard draft requests to
Publishing API V2.
# 37.3.0
* Add a helper method for extracting service feedback from the performance
platform.
# 37.2.0
* Add test helper method `publishing_api_has_linked_items` for the Publishing
API V2 method `get_linked_items`.
# 37.1.0
* Add `restore_asset` method to allow the restoration of deleted assets.
# 37.0.0
* Default `always_raise_for_not_found` to true when not configured and add
deprecation warning for when a client app uses the setter to change the value.
From December 1st, 2016 it won't be possible to configure this option anymore
and therefore all responses will raise a `GdsApi::HTTPNotFound` for 404s and
`GdsApi::HTTPGone` for 410s;
* Default `hash_response_for_requests` to true when not configured and add
deprecation warning for when a client app uses the setter to change the value.
From December 1st, 2016 it won't be possible to configure this option anymore
and therefore all responses will behave like a `Hash` instead of an
`OpenStruct`;
* Add helper methods to stub and assert Rummager searches;
* Stop using `content_format` in the Publishing API tests;
* Documentation added to `get_content_items`;
* Ruby version upgraded to `2.3.1`;
* Added `govuk-lint` to the project.
# 36.4.1
* Fix bug where the total number of pages was being calculated incorrectly on
`publishing_api_has_content`;
* Return only the expected items based on the pagination parameters on
`publishing_api_has_content`.
# 36.4.0
* Remove search-related fields from Panopticon Registerer now that these fields
are no longer sent by client apps.
# 36.3.0
* Add a support-api endpoint for creating 'Page Improvements'.
# 36.2.0
* Add delete_asset method, to support the new delete asset functionality now supported by asset manager.
* Fix issue where rspec style matchers would cause issues, since they do not implement the fetch method.
# 36.1.0
* Add helpers for the support-api for fetching problem reports and
for marking problem reports as spam.
# 36.0.1
* Add option to return results as a hash in `GdsApi::ListResponse`.
# 36.0.0
* Remove `GdsApi::Rummager#unified_search`. The `/unified_search` endpoint
has been removed in rummager in favor of `/search`.
**This is a breaking change**, which means applications currently using
`#unified_search` need to migrate to `#search`.
# 35.0.1
* Fix issue where Pact would hit the Publishing API in development if the
service was running on the same port `3093`;
* Return pagination information from `publishing_api_has_content` in order to
reflect what would happen in a real request.
# 35.0.0
* Remove methods for `with_tag` endpoint for content api. These methods are not
used by any client. The endpoint is scheduled to be removed soon.
* Add test helper for Gone items in content store
# 34.1.0
* Deprecate `GdsApi::Rummager#unified_search`. The `/unified_search` endpoint
has been deprecated in rummager in favor of `/search`.
# 34.0.0
* De-deprecate `delete_document` helpers, because the endpoint is still useful.
* Allow `/document/` helpers to take an optional index parameter, mirroring the API.
* Allow all assert methods to pass through additional webmock options
* Change `stub_any_rummager_post` to behave the same as
`stub_any_rummager_post_with_queueing_enabled`: rummager always returns 202 and
so should our stubs.
* Deprecate `stub_any_rummager_post_with_queueing_enabled` as it is now redundant.
# 33.2.2
* Fix JsonClient not explicitly requiring the config
# 33.2.1
* Send correct headers for GET and DELETE requests.
* Extend option to always raise for 404 and 410 to `get_raw`.
# 33.2.0
* Update RestClient version to 2.0.0
# 33.1.0
* Add support-api global export endpoint
# 33.0.0
* Simplify state name presentation (live to published)
# 32.3.0
* Add option to always raise for 404 and 410.
* Add option to make `GdsApi::Response` just behave like a hash, not an OpenStruct
* Add local links manager local authority endpoint
# 32.2.1
* Fix LocalLinksManager test URLs
# 32.2.0
* Add LocalLinksManager `local_link` adapter.
# 32.1.0
* Add `publishing_api_has_item_in_sequence` test helper
# 32.0.0
* Allow publishing apps to request a specific version of content.
* Provide a warning message for rummager deprecated stubs
* Modify Rummager test helper to use `rummager` vhost instead of `search`.
**This is a breaking change**, which means applications currently using the
helper will need to ensure they are using the `rummager` vhost when creating
the adapter.
# 31.4.0
* Add `allow_draft` flag that can optionally be set when unpublishing
# 31.3.0
* Add `stub_any_rummager_delete_content` and `assert_rummager_deleted_content`
# 31.2.0
* Add an `area_for_code` method to the MapIt API adapter.
# 31.1.0
* Add a `stub_any_publishing_api_publish` test helper
# 31.0.0
* Remove `format` from expected results in Pact tests.
# 30.9.0
* Add `assert_publishing_api_unpublish` test-helper
# 30.8.0
* Stubs successful and failing attachment uploads to asset manager.
# 30.7.0
* Allow `bulk_publishing` flag for `PublishingApiV2#patch_links`
# 30.6.0
* Use `document_type` & `schema_name` in special routes
* Remove implicit dependency between JsonClient and GdsApi::Base
* Add support for get_expanded_links endpoint
# 30.5.0
* Add `locale` parameter to unpublish endpoint.
* Add publishing-api `stub_publishing_api_unpublish` test helper method.
# 30.4.0
* Add publishing API POST /v2/content/:content_id/unpublish endpoint
See docs for more info: https://github.com/alphagov/publishing-api/blob/master/doc/api.md#post-v2contentcontent_idunpublish
# 30.3.0
* Add email alert API `GET notifications` and `GET notification` endpoints
# 30.2.1
* Update README and add documentation. See http://www.rubydoc.info/gems/gds-api-adapters/GdsApi/PublishingApiV2 for example.
* Remove content-register
# 30.2.0
* Add `GdsApi::GovukHeaders.clear_headers`
# 30.1.0
* Add stub for `publishing_api_get_content` in `publishing_api_v2` test helper.
# 30.0.1
* Extend publishing API 'stub_publishing_api_publish' test helper to accept a response hash
# 30.0.0
* Change test helper
* Wrap resultant calls to GET /v2/content in meta data
# 29.6.0
* Add `GdsApi::PublishingApiV2#lookup_content_id` and `GdsApi::PublishingApiV2#lookup_content_id`
* Add `publishing_api_has_lookups` test helpers
# 29.5.0
* Add `GdsApi::HTTPUnprocessableEntity` to represent a 422 error
# 29.4.0
* Support searching by a `document_type` in email-alert-api
# 29.3.1
* Fix linkables test helper.
# 29.3.0
* Prefer `document_type` as the arg for `get_linkables`, deprecating `format`.
# 29.2.0
* Enable Draft Content Store testing in test helpers.
# 29.1.1
* Fix publishing API `patch_links` test helpers.
# 29.1.0
* Add code and pact tests for the new publishing API `GET /v2/linkables` endpoint.
# 29.0.0
* Breaking change: rename `PublishingApiV2`'s `put_links` to `patch_links` to
better reflect the behaviour. Also rename related test_helper methods.
# 28.3.1
* Fixed `TestHelpers::Imminence` missing `end`.
# 28.3.0
* `TestHelpers::Imminence` now has `imminence_has_places_for_postcode` and
`stub_imminence_places_request` helper methods. There was previously no helper
for the find places with post code tools. It is now possible to stub all requests
for places with any status code or payload as required.
# 28.2.1
* `TestHelpers::PublishingApiV2` now has a `publishing_api_does_not_have_links` test helper
which stubs the Publishing API V2 to return the 404 payload for the `content_id` passed
as the arg.
# 28.2.0
* Pass the Govuk-Original-Url header on to requests made by gds-api-adapters,
similarly to the existing Govuk-Request-Id header. Rails applications will
get this support automatically when using this version of gds-api-adapters.
Other applications will need to add an explicit call to `use
  GdsApi::GovukHeaderSniffer, 'HTTP_GOVUK_ORIGINAL_URL'`, as detailed in the
README.
# 28.1.1
* `TestHelpers::PublishingApiV2` now has a `publishing_api_does_not_have_item` test helper
which stubs the Publishing API V2 to return the 404 payload for the `content_id` passed
as the arg.
# 28.1.0
* Add `PublishingApiV2#get_content!`
# 28.0.2
* `TestHelpers::PublishingApiV2` now has a `publishing_api_has_links` test helper
which stubs the Publishing API V2 to return the links payload which is supplied
as the arg.
# 28.0.1
* In `TestHelpers::PublishingApiV2` - for methods that accept an optional arg
`attributes_or_matcher`, change the default value to nil so that we don't test
for request body if the argument is not supplied.
* In the same class, fix `#publishing_api_has_fields_for_format` so that we
always cast the `item` arg to Array. This correctly handles cases where only
one item is passed in.
# 28.0.0
* Drop support for Ruby 1.9.3
* Raise `HTTPUnauthorized` for 401 responses
# 27.2.2
* Fix `ContentStore#incoming_links`.
# 27.2.1
* `PublishingAPIV2`: prevent clients from using nil `content_id` values
# 27.2.0
* Add some useful test helpers for EmailAlertApi
# 27.1.1
* Pin Plek to >= 1.9.0 because gds-api-adapters depends on Plek.find
# 27.1.0
* Add Plek service discovery for EmailAlertApi test helpers
# 27.0.0
* Fix issue within `PublishingApiV2` test helpers where
`request_json_matching` and `request_json_including` were incorrectly
named and had the opposite behaviour.
* The default behaviour of `assert_publishing_api` (and the more specific
helpers that use it) is not to match the entire supplied attributes.
To do partial matching use `request_json_includes`
* Add support for symbol keys to the `PublishingApiV2` test helpers.
# 26.7.0
* Add support for Rummager's `delete_content` & `get_content`.
# 26.6.0
* Add `PublishingApiV2#get_linked_items`.
# 26.5.0
* Changed SpecialRoutePublisher to use v2 of publishing-api
# 26.4.0
* Performance Platform: add test helper stub for non-existent datasets
* Add `ContentStore#incoming_links!`
# 26.3.1
* Fix the composite "put and publish" Publishing API v2 stub
# 26.3.0
* Publishing API v2: add stub for 404 responses
# 26.2.0
* Support optional locale and previous version for discard_draft publishing API call
# 26.1.0
* Add publishing api discard draft endpoint
# 26.0.0
* Flesh out and rename methods in Publishing API v2 test helpers
# 25.3.0
* Add Test Helpers for Publishing API V2 `index` and `get`
# 25.2.0
* Add `PublishingApiV2#get_content_items`.
# 25.1.0
* Add support for optimistic locking to the v2 publishing API endpoints.
# 25.0.0
* Allow `Mapit#location_for_postcode` to raise if it receives something which doesn't look like a postcode.
# 24.8.0
* Add test helpers for Publishing-API path reservation endpoint
# 24.7.0
* Add put_path endpoint for Publishing-API
# 24.6.0
* Support segments_mode option for add_redirect_route.
# 24.5.0
* Adds the helper methods and pact tests for the GET and PUT publishing API endpoints for managing links
independently of content items.
# 24.4.0
* Set the connection timeout in RestClient as well as the read timeout
# 24.3.0
* Raise `HTTPConflict` exception for HTTP 409 status codes.
# 24.2.0
* Change the Panopticon Registerer adapter to support the `content_id` field.
# 24.1.0
* Add test helper `content_register_isnt_available`
# 24.0.0
* Remove support for the `/organisations` endpoint on Rummager.
# 23.2.2
* Bugfix: `SpecialRoutePublisher` handles case where `Time.zone` returns `nil`
# 23.2.1
* Bugfix: remove invalid require from GdsApi::Helpers
# 23.2.0
* Add special route publisher under PublishingApi.
  This is used in several apps for registering
"special" routes like /government or /robots.txt
See https://trello.com/c/blLdEZN5/292-make-apps-register-special-routes-on-deploy
# 23.1.0
* GdsApi::TestHelpers::PublishingApi
added the ability to make more flexible assertions about publishing api
requests by optionally passing a predicate to the assertions. The
`request_json_including` predicate will match required elements of a hash or
a nested hash in the JSON body.
The existing stricter behaviour is retained as the default
(`request_json_matching` predicate).
# 23.0.0
* Remove `GdsApi::Rummager#search`. The `/search` endpoint was removed
from rummager in favor of `/unified_search`.
# 22.0.0
* Remove `FinderAPI` and `FinderSchema` classes.
Finder API has been retired and these are no longer used.
* Raise specific error on 404 in content-store client.
# 21.0.0
* Using GdsApi::ContentApi#tag without the second parameter for tag type will
raise an error.
# 20.1.2
* Fix content-store test stubs key type
* Fix GdsApi::Response `method_missing` signature
# 20.1.1
* Fix stray collections-api require
# 20.1.0
* Change the user agent string to include the app name, eg:
`GDS Api Client v. 20.1.0` -> `gds-api-adapters/20.1.0 (whitehall)`
# 20.0.0
* remove collections-api client and helpers.
The application has been retired.
* Don't cache Cache-Control "private" and "no-store" responses.
* Update content_store_has_item test helper to support better overriding of
Cache-Control headers in responses.
# 19.2.0
* Raise HTTPForbidden on a 403 response.
# 19.1.0
* Pass HTTP_X_GOVUK_AUTHENTICATED_USER in API requests.
# 19.0.0
* Remove old policy test helpers for rummager.
# 18.11.0
* Add support for organisation list and show in Support API
# 18.10.0
* Add support for exporting CSVs from the Support API
# 18.9.1
* Fix a bug in the SupportApi test helper for organisations_list
# 18.9.0
* Support API: add adapters for `/anonymous-feedback/organisations` list, and
`/anonymous-feedback/organisations/:slug` feedback summary endpoints.
# 18.8.0
* Support API: add adapter for `/anonymous-feedback` feed endpoint
# 18.7.0
* Change name of Rummager policy-tagging test helpers to reflect the fact that
they stub for any type. Deprecates the old helpers.
# 18.6.0
* Change Rummager test helpers to allow stubbing specific counts for requests
for policies for an organisation.
# 18.5.0
* Add Rummager test helpers to stub requests for policies for an organisation.
# 18.4.0
* Make TestHelpers::Organisations more flexible
# 18.3.1
* Fix `content_api_has_artefacts_with_a_tag` test helper
# 18.3.0
* Add assert_publishing_api_put_draft_item to publishing_api_helpers
* Bump rest-client for security fixes
# 18.2.0
* Remove base_path from publishing_api content_item_helpers.
* Add helper to stub draft publishing API default.
# 18.1.0
* Adds support for the PUT endpoint of content register
* Test helpers for the `GdsApi::ContentRegister` adapter
# 18.0.0
* Update rest-client dependency for security fixes: https://github.com/rest-client/rest-client/commit/221e3f200f76bd1499591fbc6c3ea3f6183b66ef
* Publishing API test helpers responses no longer include the entities
* Government API test helpers responses include the content_id
# 17.6.0
* Add publishing API method to `PUT` draft content items, to be stored only in draft content-store.
# 17.5.0
* Add ability to pass `n` to some PublishingAPI test helpers to say how many times
the request should be expected.
# 17.4.0
* Add delete helpers to GdsApi::TestHelpers::Rummager
# 17.3.0
* Deprecate passing `type` to `GdsApi::Router#get_route` and `GdsApi::Router#delete_route`
# 17.2.0
* Add PublishingApi intent endpoints and test helpers.
# 17.1.0
* Add a test helper to stub Rummager's behaviour when queueing is enabled.
# 17.0.1
* Change the order of the ContentAPI `tags` request stubs as the first matching
stub is used.
* Loosens the live `tags` stub to allow cachebust, as per the draft version.
# 17.0.0
* Change the matching behaviour of ContentAPI test helpers to loosen their
param requirements.
* Add a `bust_cache` option for the ContentAPI `tags` endpoint.
# 16.5.0
* Add Whitehall and Publisher endpoints for reindexing editions tagged to topics.
# 16.4.0
* Change the Panopticon Registerer adapter to support the `public_timestamp`
and `latest_change_note` fields.
# 16.3.4
* Update content API test helper `#content_api_has_artefacts_with_a_tag` to
guard against missing keys
# 16.3.3
* Extend content API test helper `#content_api_has_artefacts_with_a_tag` to
support options for artefacts.
# 16.3.2
* Update collections API test helpers to reflect reality.
# 16.3.1
* Add test helper method to stub gone route registration.
# 16.3.0
* Add support for registering gone routes with the router.
# 16.2.0
* Add `start` and `total` options to the Collections API test helper.
# 16.1.0
* Allow the `start` and `count` arguments to be provided to the Collections API
adapter for a sub-topic.
# 16.0.0
* Pass correct parameters to assert_requested in email alert API endpoint helpers.
* Include latest changes in collections API endpoint.
# 15.2.0
* Add Rummager test helpers for adding documents
# 15.1.1
* Change e-mail alert API endpoints
# 15.1.0
* Add panopticon support for `primary_section` and `sections`
* Deprecate support for `section`
# 15.0.0
* Reduce public interface for Email Alert API (unused)
# 14.11.0
* Add support for endpoints to create, update and publish tags in Panopticon
# 14.10.0
* Add email alert API support
# 14.9.0
* Add problem reports endpoint to `support-api`
# 14.8.0
* Add `support-api` endpoint for getting problem report daily totals
* Add Performance Platform endpoint for uploading problem report daily totals
# 14.7.0
* Add long-form contact endpoint to `support-api`
# 14.6.0
* Add extra layer of inheritance for HTTP Exception classes to provide HTTPServerError and HTTPClientError in order to allow applications to catch ranges of Server/Client type errors.
* Add a helper method, `build_specific_http_error`, in order to handle raising specific error types based on HTTP error codes.
# 14.5.0
* Add `custom_matcher` parameter to panopticon `stub_artefact_registration` to allow partial matching of request body.
# 14.4.2
* Corrects the endpoint for `#collections_api_has_no_curated_lists_for` test helper
# 14.4.1
* Corrects the collections api endpoint for curated lists
# 14.4.0
* Add more content api draft tag functions.
# 14.3.0
* Add "does not have" stubs for content api tags.
* Expand collections api test helpers.
# 14.2.0
* Adds basic collections API client and test helpers
# 14.1.1
* Update the content API sorted tags test helper to support draft mode.
# 14.1.0
* Add rummager test helper for stubbing sub-sector organisations (`rummager_has_specialist_sector_organisations`).
# 14.0.0
* Split content item write API from the content store client into a new publishing API client.
# 13.0.0
* `FinderSchema#user_friendly_values` now returns a hash with the slug version
of the attribute as the key, with a label and a values Array, which contains
a label and slug version of each value.
# 12.5.0
* Add test helper for content store being unavailable
# 12.4.2
* Stub "everything including draft" calls for tag test helpers
even when the helpers themselves aren't setting up draft tags.
# 12.4.1
* Fixes services and info data fixture
# 12.4.0
* Add `rummager_has_no_services_and_info_data_for_organisation`
# 12.3.0
* Add "draft" option to the content_api.tags method.
# 12.2.0
* Add `.expires_in` and `.expires_at` to `GdsApi::Response`. These methods expose
  the expiration time in seconds and the absolute time value respectively, by
  inferring them from the max-age or expires values received in the response
  from content store.
# 12.1.0
* Add rummager test helpers
# 12.0.0
* `FinderSchema#user_friendly_facet_value` always returns metadata values as Arrays.
# 11.6.0
* Support the `sort` parameter on the Content API adapter's `tags` and `child_tags` methods.
* Include test for Rummager adapter behaviour on a response with a 422 status.
# 11.5.0
* Add `support-api` adapter
# 11.4.0
* Improve content-store test helpers
# 11.3.0
* Add `add_document` and `remove_document` methods to Rummager adapter.
# 11.2.0
* Add content-store test assertion helper.
# 11.1.0
* Adds areas_for_type method for Imminence.
# 11.0.0
* BREAKING CHANGE: router client no longer commits by default (see
https://github.com/alphagov/gds-api-adapters/commit/f7a6f5e for more details).
* Added more test helpers for router client
# 10.17.1
* Bug fix: remove `put_content_item`, we want to be aware of `PUT` failing on content store
# 10.17.0
* Add methods for PUTing data to content-store.
# 10.16.0
* Lookup Mapit areas by postcode via Imminence.
# 10.15.1
* Fix inheritance of content-store client.
# 10.14.0
* Update business support scheme helpers to match API behaviour.
# 10.13.0
* Expose Mapit areas by area type. Adds the method `areas_for_type(type)`.
# 10.11.2
* Update organisation test helper, adding logo and brand class details to the helper
# 10.11.1
* Add router test helpers
# 10.11.0
* Add Maslow adapter with link builder
# 10.10.1
* Add panopticon artefact registration test helper
# 10.10.0
* Add support for registering multiple need_ids with panopticon
# 10.9.0
* Add support for `PUT` multipart requests
* Add support for replacing assets in asset manager using `PUT`
# 10.8.0
* Data-in API (corporate content problem reports): endpoints for counts, top urls
# 10.7.0
* Include organisation slugs if available when registering artefacts
# 10.6.4
* Add an `artefact!` method to Content API adaptor that can raise exceptions
# 10.6.3
* Added `needs_by_id` method to Need API adaptor for retrieving multiple needs with one request
# 10.6.2
* Data-in API (service feedback): specify which dataset is missing in the exception
# 10.6.1
* Fix bug with default schema factory for finder API
# 10.6.0
* Add more methods to interact with the finder API schema response
# 10.5.0
* Add new unified search endpoint for Rummager adapter
# 10.4.0
* Added method for finder API schema endpoint
# 10.3.0
* Added client for interacting with the GOV.UK [finder API](https://github.com/alphagov/finder-api).
* Added support for array parameters in query strings (eg `foo[]=bar&foo[]=baz`)
# 10.2.0
* Modify test helpers to match changes to `web_url` and `tag_id` in Content API.
* Add test helper for stubbing artefacts with multiple tags.
# 10.1.0
* Added client for interacting with the GOV.UK [external link tracker](https://github.com/alphagov/external-link-tracker).
# 10.0.0
* Query business support schemes by faceted search parameters e.g. `locations=scotland,england`.
* Remove the ability to retrieve by identifiers.
# 9.0.0
* Remove obsolete `curated_lists` method from Panopticon API.
# 8.4.1
* Rename `industry_sectors` attribute to `specialist_sectors` to match Panopticon
# 8.4.0
* Add deep-link to anonymous feedback in Feedex.
# 8.3.2
* Update to a more sensible Performance Platform DataIn service feedback endpoint.
# 8.3.1
* Bugfix to constructing the Performance Platform DataIn service feedback endpoint.
# 8.3.0
* Add the Performance Platform DataIn service feedback endpoint, for uploading service feedback aggregated stats.
# 8.2.3
* Allow the `industry_sectors` attribute to be provided to the Panopticon registerer.
* New Content API test helper added for stubbing `with_tag.json` request with a custom sort order.
* The Content API tag tests now use test helpers to stub endpoints.
* Removed the `include_children` parameter from Content API adapters, which was removed from the Content API in April '13.
* Fix for the `content_api_has_artefacts_with_a_tag` helper to not expect query string parameters in an order when stubbing URLs.
* Fix for a typo in a test helper.
# 8.2.2
* Changes the test helpers for stubbing out Content API requests for artefacts with section tags so that they work for any tag type.
# 8.2.1
* Fix a bug where `gds_api/govuk_request_id.rb` would fail to load if the `GdsApi` module was not already defined.
# 8.2.0
* Add a method to re-open closed needs in the need API.
# 8.1.0
* We've added a unique request ID called `GOVUK-Request-Id` at the varnish layer so that it's easier to trace a request moving through the GOV.UK application stack. This change ensures that all API calls pass on the `GOVUK-Request-Id` header. More details in [README.md](https://github.com/alphagov/gds-api-adapters#middleware-for-request-tracing).
# 8.0.0
* Changes to the Content API adapter to decouple tag methods from the `section` tag type.
* Changes to Content API test helper stub data which may break tests in clients.
# 7.5.1
* Support app: problem report creation happens on `/anonymous_feedback/problem_reports` instead of `/problem_reports`
# 7.3.0
* Integrate Support app API for creating FOI requests
# 7.1.0
* Add Rummager method for `/organisations.json`
# 7.0.0
* Support arbitrary search parameters in `GdsApi::Rummager`
* Remove obsolete format_filter param from `GdsApi::Rummager` methods
* Remove obsolete autocomplete method from `GdsApi::Rummager`
# 6.1.0
* Add Content API method for `/artefacts.json`
# 6.0.0
Potentially backwards-incompatible changes:
* The `disable_timeout` option has been removed.
* `JsonClient` now respects the `Expires` headers when caching results. If no `Expires` header is set, the global cache TTL will be used (defaults to 15 mins).
* The Rummager client now inherits from `GdsApi::Base`. This means that it uses `JsonClient` and therefore inherits its timeout and caching behaviour.
Other changes:
* Added Worldwide API client.
| 26.427449 | 348 | 0.76093 | eng_Latn | 0.957539 |
fa3f7bc23888f0e513b8b011145b3bf2e6bbb7d4 | 2,104 | md | Markdown | _posts/2021-03-01-SENet.md | heedaeKwon/heedaeKwon.github.io | c62b2cf04907c25483988838f984a26f8bbd937e | [
"MIT"
] | null | null | null | _posts/2021-03-01-SENet.md | heedaeKwon/heedaeKwon.github.io | c62b2cf04907c25483988838f984a26f8bbd937e | [
"MIT"
] | null | null | null | _posts/2021-03-01-SENet.md | heedaeKwon/heedaeKwon.github.io | c62b2cf04907c25483988838f984a26f8bbd937e | [
"MIT"
] | 1 | 2021-01-10T09:49:32.000Z | 2021-01-10T09:49:32.000Z | # **Squeeze-and-Excitation Networks**
___
**1. INTRODUCTION**
**2. RELATED WORK**
**3. SQUEEZE AND EXCITATION**
**4. MODEL AND COMPUTATIONAL COMPLEXITY**
**5. EXPERIMENTS**
___
## **Introduction**
___
###### This paper investigates the relationships between channels.
###### It introduces a new unit, the squeeze-and-excitation block.
###### Its advantages: it is simple and computationally lightweight,
###### and it only slightly increases model complexity and computational burden.
## **Related work**
___
###### Deeper architectures
###### Algorithmic Architecture Search
###### Attention and gating mechanisms
## **Squeeze and Excitation**
___
###### Squeeze: as the name suggests, an operation that squeezes information out.
###### Its purpose is to extract and keep only the important information.

###### Excitation: recalibrates the squeezed representation.
###### The excitation step uses nonlinear functions.


## **Model and computational complexity**
###### The SE block is a trade-off between performance and complexity.
###### As a model gets deeper, its complexity rises and its performance improves.
###### The paper therefore compares ResNet-50 with SE-ResNet-50.
###### The time taken for a single forward and backward pass is:
|ResNet-50|SE-ResNet-50|
|------|---|
|190ms|209ms|
###### and the parameter counts are:
|ResNet-50|SE-ResNet-50|
|------|---|
|25M|27.5M|
###### so SE-ResNet-50 has 2.5M more parameters.
###### But looking at the performance,

###### SE-ResNet-50 shows much better performance.
## **Implementation**
---
###### Data augmentation:
###### 224x224 random crop, random horizontal flipping
|Hyperparameter|value|
|------|---|
|training data|1.28M|
|validation|50K|
|error rate|Top-1, Top-5|
|Learning rate|0.6, divided by 10 every 30 epochs|
|epochs|100|
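###### In PyTorch terms, that schedule corresponds to something like the sketch below (an illustration, not the paper's code; the momentum value is an assumption).
```python
import torch
import torch.nn as nn

model = nn.Linear(10, 10)  # stand-in for SE-ResNet-50
optimizer = torch.optim.SGD(model.parameters(), lr=0.6, momentum=0.9)  # momentum assumed
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(100):
    # ... training and validation steps would go here ...
    scheduler.step()  # lr: 0.6 -> 0.06 at epoch 30 -> 0.006 at epoch 60 -> ...
```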
###### The results show that using the SE block gives better results.

| 22.147368 | 110 | 0.697719 | kor_Hang | 0.98047 |
fa3fe3fe2afbb516eacea073e2c7f7dee33b60a2 | 1,664 | md | Markdown | translations/pt-BR/content/organizations/restricting-access-to-your-organizations-data/approving-oauth-apps-for-your-organization.md | Varshans2/docs | b9dc0417694f59b81f2f597bf72bdf116d05f88a | [
"CC-BY-4.0",
"MIT"
] | 6 | 2021-02-17T03:31:27.000Z | 2021-09-11T04:17:57.000Z | translations/pt-BR/content/organizations/restricting-access-to-your-organizations-data/approving-oauth-apps-for-your-organization.md | Honeyk25/docs | de643512095e283cb3a56243c1a7adcf680a1d08 | [
"CC-BY-4.0",
"MIT"
] | 144 | 2021-10-21T04:41:09.000Z | 2022-03-30T09:55:16.000Z | translations/pt-BR/content/organizations/restricting-access-to-your-organizations-data/approving-oauth-apps-for-your-organization.md | Honeyk25/docs | de643512095e283cb3a56243c1a7adcf680a1d08 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2022-03-08T03:54:20.000Z | 2022-03-15T06:28:15.000Z | ---
title: Approving OAuth apps for your organization
intro: 'When an organization member requests {% data variables.product.prodname_oauth_app %} access to organization resources, organization owners can approve or deny the request.'
redirect_from:
- /articles/approving-third-party-applications-for-your-organization/
- /articles/approving-oauth-apps-for-your-organization
- /github/setting-up-and-managing-organizations-and-teams/approving-oauth-apps-for-your-organization
versions:
fpt: '*'
topics:
- Organizations
- Teams
shortTitle: Approve OAuth apps
---
When {% data variables.product.prodname_oauth_app %} access restrictions are enabled, organization members must [request approval](/articles/requesting-organization-approval-for-oauth-apps) from an organization owner before they can authorize an {% data variables.product.prodname_oauth_app %} that has access to organization resources.
{% data reusables.profile.access_org %}
{% data reusables.profile.org_settings %}
{% data reusables.organizations.oauth_app_access %}
5. Next to the app you'd like to approve, click **Review**. ![Review link](/assets/images/help/settings/settings-third-party-approve-review.png)
6. After you review the information about the requested app, click **Grant access**. ![Grant access button](/assets/images/help/settings/settings-third-party-approve-grant.png)
## Further reading
- "[About {% data variables.product.prodname_oauth_app %} access restrictions](/articles/about-oauth-app-access-restrictions)"
| 61.62963 | 377 | 0.790264 | por_Latn | 0.925991 |
fa3febdeacf424b7b0e8ed968bf49c04cb15c0b6 | 3,275 | md | Markdown | content/tutorial/notebooks/dual_energy/_index.en.md | JeanBilheux/neutronimaging_pages_ornl_gov_with_hugo | 13679ad48de3e800cf4c50a69b2f98cc104d28df | [
"MIT"
] | null | null | null | content/tutorial/notebooks/dual_energy/_index.en.md | JeanBilheux/neutronimaging_pages_ornl_gov_with_hugo | 13679ad48de3e800cf4c50a69b2f98cc104d28df | [
"MIT"
] | null | null | null | content/tutorial/notebooks/dual_energy/_index.en.md | JeanBilheux/neutronimaging_pages_ornl_gov_with_hugo | 13679ad48de3e800cf4c50a69b2f98cc104d28df | [
"MIT"
] | null | null | null | ---
title: Dual Energy
post: " <i class='fa fa-battery-full'></i>"
pre: "- "
---
**Notebook name**: dual_energy.ipynb
## Description
This notebook provides an automated way to determine the best contrast within an energy range for a selected region
of interest.
((add link to paper describing the technique here))
### Select your IPTS
{{% notice info %}}
Need help using the [IPTS selector]({{%relref "/tutorial/notebooks/select_ipts/_index.md#activate-search" %}})?
{{% /notice %}}
### Select data input folder
Select the input data folder. This folder should contain all the images recorded by the MCP detector. By default this
folder will contain the time spectra file needed by the program. If it does not, a new window will ask you to browse
for this file once the data have been loaded.
Using the [folder selection tool]({{%relref "/tutorial/notebooks/file_selector/_index.md#activate-search" %}}), select
the folder that contains the images you need to work on.
### Launch the user interface (UI)
A user interface will show up; it may be hidden behind the current browser window, so check behind it if you don't
see it appear.
<img here>
This UI allows you to select the region of the image you want to investigate, choose the size of the bins to use,
and then launch the calculation.
#### Select your region of interest
Using your mouse, **move** and **resize** the region of interest displayed on top of the preview (left part of the UI).
Any action on this box will update the right plot of the Mean transmission vs file index / TOF or lambda.
<img src='/tutorial/notebooks/dual_energy/images/roi.gif' />
#### Select profile range and size of bins
Using the range selection in the right plot, select the range you want to work on as well as the size of the bins to
use. Those bins will be displayed inside the range you selected. All images within each bin will be grouped into one
image of averaged counts and used in the next step.
<img src='/tutorial/notebooks/dual_energy/images/range_and_bin_selection.gif' />
#### Calculation
As soon as you click the **Calculation** tab, the algorithm runs. It uses the region of interest you selected in the
left image, averages the counts of this region, then groups all the images within each bin and calculates the ratio
of every combination of bins. The ratio with the largest difference from 1 (the best contrast) is highlighted in the
table, and the corresponding ratio of images is displayed on the right side of the **Calculation** tab.
<img src='/tutorial/notebooks/dual_energy/images/calculation.gif' />
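As a rough sketch of what this step computes (an illustration only, not the notebook's actual code; all names below are made up):
```python
import numpy as np

def best_bin_ratio(roi_means: np.ndarray, bins: list):
    """Average each bin of ROI counts, then find the bin pair whose ratio
    differs most from 1 (i.e. gives the best contrast)."""
    bin_means = np.array([roi_means[start:end].mean() for start, end in bins])
    best_score, best_pair = -1.0, None
    for i in range(len(bin_means)):
        for j in range(len(bin_means)):
            if i == j:
                continue
            ratio = bin_means[i] / bin_means[j]
            if abs(ratio - 1.0) > best_score:
                best_score, best_pair = abs(ratio - 1.0), (i, j)
    return best_pair, bin_means

# Example: 100 ROI-averaged values, split into 5 bins of 20 images each.
means = np.random.rand(100)
pair, _ = best_bin_ratio(means, [(i, i + 20) for i in range(0, 100, 20)])
```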
Below the master table, you will find a smaller table that summarizes the information related to the 'optimum' bin
ratio, such as the file indexes used in the bins and the TOF and lambda ranges.
<img src='/tutorial/notebooks/dual_energy/images/summary_table.png' />
#### Export image
You can export the current image displayed in the table by clicking the **Export image ...** button below the image
on the right. Simply select the location where you want to create that image and click **OK**. By default, the image
is named using the input folder name as a prefix, followed by the bins used to calculate it.
For example: IPTS-25778_normalized_bin18_divided_by_bin9.tif
fa420d74e2f9366f3e5f1c076d946c2696e9c44a | 726 | md | Markdown | README.md | karlhigley/torch-optim-sparse | b6dd32f60f27dbf7707ff1e050e9f193d538345f | [
"MIT"
] | 9 | 2020-09-23T04:18:48.000Z | 2021-12-31T11:40:37.000Z | README.md | karlhigley/torch-optim-sparse | b6dd32f60f27dbf7707ff1e050e9f193d538345f | [
"MIT"
] | null | null | null | README.md | karlhigley/torch-optim-sparse | b6dd32f60f27dbf7707ff1e050e9f193d538345f | [
"MIT"
] | 1 | 2021-12-31T11:40:52.000Z | 2021-12-31T11:40:52.000Z | # torch-sparse-optim
This library implements "sparser" versions of PyTorch optimizers, which only apply momentum and weight decay updates to parameters where the gradients are non-zero.
It contains four optimizers:
- SparserSGD
- SparserAdam
- SparserSGDW
- SparserAdamW
The latter two follow the approaches outlined in ["Decoupled Weight Decay Regularization"](https://arxiv.org/abs/1711.05101) by Loshchilov & Hunter from ICLR 2019.
Except for SGDW, they're all straightforward ports of the existing optimizers from PyTorch, modified only to convert momentum and weight decay to sparse updates. The SGDW optimizer additionally applies a small change to where/how weight decay is applied, as outlined in the paper above.
| 51.857143 | 286 | 0.80854 | eng_Latn | 0.996739 |
fa422dd2a8728204c04236c0c891ad93d1c953fe | 2,187 | md | Markdown | problems/encode-number/README.md | jianpingbadao/leetcode | 0ddead0b7820c05e353e08c551418fc8e318ee96 | [
"MIT"
] | null | null | null | problems/encode-number/README.md | jianpingbadao/leetcode | 0ddead0b7820c05e353e08c551418fc8e318ee96 | [
"MIT"
] | null | null | null | problems/encode-number/README.md | jianpingbadao/leetcode | 0ddead0b7820c05e353e08c551418fc8e318ee96 | [
"MIT"
] | null | null | null | <!--|This file generated by command(leetcode description); DO NOT EDIT. |-->
<!--+----------------------------------------------------------------------+-->
<!--|@author openset <openset.wang@gmail.com> |-->
<!--|@link https://github.com/openset |-->
<!--|@home https://github.com/openset/leetcode |-->
<!--+----------------------------------------------------------------------+-->
[< Previous](https://github.com/openset/leetcode/tree/master/problems/maximum-score-words-formed-by-letters "Maximum Score Words Formed by Letters")
[Next >](https://github.com/openset/leetcode/tree/master/problems/smallest-common-region "Smallest Common Region")
## [1256. Encode Number (Medium)](https://leetcode.com/problems/encode-number "加密数字")
<p>Given a non-negative integer <code>num</code>, Return its <em>encoding</em> string.</p>
<p>The encoding is done by converting the integer to a string using a secret function that you should deduce from the following table:</p>
<p><img alt="" src="https://assets.leetcode.com/uploads/2019/06/21/encode_number.png" style="width: 164px; height: 360px;" /></p>
<p> </p>
<p><strong>Example 1:</strong></p>
<pre>
<strong>Input:</strong> num = 23
<strong>Output:</strong> "1000"
</pre>
<p><strong>Example 2:</strong></p>
<pre>
<strong>Input:</strong> num = 107
<strong>Output:</strong> "101100"
</pre>
<p> </p>
<p><strong>Constraints:</strong></p>
<ul>
<li><code>0 <= num <= 10^9</code></li>
</ul>
### Related Topics
[[Bit Manipulation](https://github.com/openset/leetcode/tree/master/tag/bit-manipulation/README.md)]
[[Math](https://github.com/openset/leetcode/tree/master/tag/math/README.md)]
### Similar Questions
1. [Convert to Base -2](https://github.com/openset/leetcode/tree/master/problems/convert-to-base-2) (Medium)
### Hints
<details>
<summary>Hint 1</summary>
Try to find the number of binary digits returned by the function.
</details>
<details>
<summary>Hint 2</summary>
The pattern is to start counting from zero after determining the number of binary digits.
</details>
| 37.067797 | 148 | 0.619113 | eng_Latn | 0.431526 |
fa42441efd8002719da8f1c04fa3d9ec3e755769 | 2,680 | md | Markdown | docs/AppApi.md | Wirk-io/Wirk-csharp | 715dcda54bdb2459aefb32522bd112d5c1fdf36f | [
"Apache-2.0"
] | null | null | null | docs/AppApi.md | Wirk-io/Wirk-csharp | 715dcda54bdb2459aefb32522bd112d5c1fdf36f | [
"Apache-2.0"
] | null | null | null | docs/AppApi.md | Wirk-io/Wirk-csharp | 715dcda54bdb2459aefb32522bd112d5c1fdf36f | [
"Apache-2.0"
] | null | null | null | # Io.Wirk.Api.Wirk.Api.AppApi
All URIs are relative to *https://api.wirk.io/v1_0*
Method | HTTP request | Description
------------- | ------------- | -------------
[**Get**](AppApi.md#get) | **GET** /App/{id} |
[**GetAll**](AppApi.md#getall) | **GET** /App |
# **Get**
> AppReaderServiceModel Get (int? id)
Get App Id
### Example
```csharp
using System;
using System.Diagnostics;
using Io.Wirk.Api.Wirk.Api;
using Io.Wirk.Api.Wirk.Client;
using Io.Wirk.Api.Wirk.Model;
namespace Example
{
public class GetExample
{
public void main()
{
var apiInstance = new AppApi();
var id = 1; // int? | id
try
{
AppReaderServiceModel result = apiInstance.Get(id);
Debug.WriteLine(result);
}
catch (Exception e)
{
Debug.Print("Exception when calling AppApi.Get: " + e.Message );
}
}
}
}
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**id** | **int?**| id |
### Return type
[**AppReaderServiceModel**](AppReaderServiceModel.md)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: application/json
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **GetAll**
> List<AppReaderServiceModel> GetAll ()
Get all apps
### Example
```csharp
using System;
using System.Diagnostics;
using Io.Wirk.Api.Wirk.Api;
using Io.Wirk.Api.Wirk.Client;
using Io.Wirk.Api.Wirk.Model;
namespace Example
{
public class GetAllExample
{
public void main()
{
var apiInstance = new AppApi();
try
{
List<AppReaderServiceModel> result = apiInstance.GetAll();
Debug.WriteLine(result);
}
catch (Exception e)
{
Debug.Print("Exception when calling AppApi.GetAll: " + e.Message );
}
}
}
}
```
### Parameters
This endpoint does not need any parameter.
### Return type
[**List<AppReaderServiceModel>**](AppReaderServiceModel.md)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: application/json
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
| 21.102362 | 180 | 0.566045 | yue_Hant | 0.537607 |
fa43691782a3dadb3d9918744e48f6aea4a2fee3 | 13,199 | md | Markdown | articles/virtual-desktop/overview.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-desktop/overview.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-desktop/overview.md | changeworld/azure-docs.sv-se | 6234acf8ae0166219b27a9daa33f6f62a2ee45ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: What is Windows Virtual Desktop? - Azure
description: An overview of Windows Virtual Desktop.
services: virtual-desktop
author: Heidilohr
ms.service: virtual-desktop
ms.topic: overview
ms.date: 04/10/2020
ms.author: helohr
manager: lizross
ms.openlocfilehash: 927696d029bf1b8742dc0001e03799322f368191
ms.sourcegitcommit: 8dc84e8b04390f39a3c11e9b0eaf3264861fcafc
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 04/13/2020
ms.locfileid: "81261728"
---
# <a name="what-is-windows-virtual-desktop"></a>Vad är Windows Virtual Desktop?
Windows Virtual Desktop är en skrivbords- och appvirtualiseringstjänst som körs i molnet.
Så här kan du göra när du kör Windows Virtual Desktop på Azure:
* Konfigurera en Windows 10-distribution med flera sessions som ger ett fullständigt Windows 10 med skalbarhet
* Virtualisera Office 365 ProPlus och optimera det så att det körs i virtuella scenarier med flera användare
* Ge virtuella Windows 7-skrivbord kostnadsfria utökade säkerhetsuppdateringar
* Ta med dina befintliga rds-datorer (Remote Desktop Services) och Windows Server till valfri dator
* Virtualisera både skrivbord och appar
* Hantera stationära datorer och appar för Windows 10, Windows Server och Windows 7 med en enhetlig hanteringsupplevelse
## <a name="introductory-video"></a>Introduktionsvideo
Lär dig mer om Windows Virtual Desktop, varför det är unikt och vad som är nytt i den här videon:
<br></br><iframe src="https://www.youtube.com/embed/NQFtI3JLtaU" width="640" height="320" allowFullScreen="true" frameBorder="0"></iframe>
Mer information om Windows Virtual Desktop finns i [vår spellista](https://www.youtube.com/watch?v=NQFtI3JLtaU&list=PLXtHYVsvn_b8KAKw44YUpghpD6lg-EHev).
## <a name="key-capabilities"></a>De viktigaste funktionerna
Med Windows Virtual Desktop kan du konfigurera en skalbar och flexibel miljö:
* Skapa en fullständig virtualiseringsmiljö för skrivbordet i din Azure-prenumeration utan att behöva köra några ytterligare gatewayservrar.
* Publicera så många värdpooler som du behöver för att hantera dina olika arbetsbelastningar.
* Ta med din egen avbildning för produktionsarbetsbelastningar eller testa från Azure Gallery.
* Minska kostnaderna med poolade resurser med flera sessioner. Med den nya multisessionsfunktionen för Windows 10 Enterprise exklusivt för windows virtual desktop och remote desktop session host (RDSH) roll på Windows Server kan du kraftigt minska antalet virtuella datorer och operativsystem (OS) omkostnader samtidigt som du tillhandahåller samma resurser till användarna.
* Ge individuellt ägande via personliga (beständiga) skrivbord.
Du kan distribuera och hantera virtuella skrivbord:
* Använd Windows Virtual Desktop PowerShell- och REST-gränssnitt för att konfigurera värdpoolerna, skapa appgrupper, tilldela användare och publicera resurser.
* Publicera fullständiga skrivbordsappar eller enskilda fjärrappar från en enda värdpool, skapa enskilda appgrupper för olika användargrupper eller tilldela till och med användare till flera appgrupper för att minska antalet bilder.
* När du hanterar din miljö använder du inbyggd delegerad åtkomst för att tilldela roller och samla in diagnostik för att förstå olika konfigurations- eller användarfel.
* Använd den nya diagnostiktjänsten för att felsöka fel.
* Hantera endast avbildningen och virtuella datorer, inte infrastrukturen. Du behöver inte hantera fjärrskrivbordsrollerna som du gör med Fjärrskrivbordstjänster, bara de virtuella datorerna i din Azure-prenumeration.
Du kan också tilldela och ansluta användare till dina virtuella skrivbord:
* När användarna har tilldelats kan de starta alla Windows Virtual Desktop-klienter för att ansluta användare till sina publicerade Windows-datorer och -program. Anslut från valfri enhet via antingen ett inbyggt program på enheten eller windows virtual desktop HTML5-webbklienten.
* Upprätta användare på ett säkert sätt via omvända anslutningar till tjänsten, så du behöver aldrig lämna några inkommande portar öppna.
## <a name="requirements"></a>Krav
Det finns några saker du behöver för att konfigurera Windows Virtual Desktop och ansluta användarna till sina Windows-datorer och -program.
Vi planerar att lägga till stöd för följande OSes, så se till att du har [rätt licenser](https://azure.microsoft.com/pricing/details/virtual-desktop/) för dina användare baserat på skrivbordet och appar som du planerar att distribuera:
|Operativsystem|Obligatorisk licens|
|---|---|
|Windows 10 Enterprise multi-session eller Windows 10 Enterprise|Microsoft 365 E3, E5, A3, A5, F1, Företag<br>Windows E3, E5, A3, A5|
|Windows 7 Enterprise |Microsoft 365 E3, E5, A3, A5, F1, Företag<br>Windows E3, E5, A3, A5|
|Windows Server 2012 R2, 2016, 2019|RDS-klientåtkomstlicens (CAL) med Software Assurance|
Din infrastruktur behöver följande saker för att stödja Windows Virtual Desktop:
* En [Azure Active Directory](/azure/active-directory/)
* En Active Directory för Windows Server synkroniseras med Azure Active Directory. Du kan konfigurera detta med något av följande:
* Azure AD Connect (för hybridorganisationer)
* Azure AD Domain Services (för hybrid- eller molnorganisationer)
* En Azure-prenumeration som innehåller ett virtuellt nätverk som antingen innehåller eller är ansluten till Active Directory för Windows Server
De virtuella Azure-datorer som du skapar för Windows Virtual Desktop måste vara:
* [Standarddomänansluten](../active-directory-domain-services/active-directory-ds-comparison.md) eller [Hybrid AD-anslutna](../active-directory/devices/hybrid-azuread-join-plan.md). Virtuella datorer kan inte Azure AD-anslutna.
* Köra en av följande [OS-bilder som stöds](#supported-virtual-machine-os-images).
>[!NOTE]
>Om du behöver en Azure-prenumeration kan du [registrera dig för en kostnadsfri utvärderingsversion](https://azure.microsoft.com/free/)på en månad . Om du använder den kostnadsfria utvärderingsversionen av Azure bör du använda Azure AD Domain Services för att hålla Din Active Directory i Windows Server synkroniserad med Azure Active Directory.
De virtuella Azure-datorer som du skapar för Windows Virtual Desktop måste ha åtkomst till följande webbadresser:
|Adress|Utgående TCP-port|Syfte|Service Tag|
|---|---|---|---|
|*.wvd.microsoft.com|443|Trafikerar tjänste-|WindowsVirtualDesktop|
|mrsglobalsteus2prod.blob.core.windows.net|443|Agent- och SXS-stackuppdateringar|AzureCloud|
|*.core.windows.net|443|Agent trafik|AzureCloud|
|*.servicebus.windows.net|443|Agent trafik|AzureCloud|
|prod.warmpath.msftcloudes.com|443|Agent trafik|AzureCloud|
|catalogartifact.azureedge.net|443|Azure Marketplace|AzureCloud|
|kms.core.windows.net|1688|Windows-aktivering|Internet|
>[!IMPORTANT]
>Vi rekommenderar att du använder tjänsttaggarna i stället för webbadresser i de flesta fall för att förhindra serviceproblem. Att avblockera dessa webbadresser är viktigt för en tillförlitlig Windows Virtual Desktop-distribution. Att blockera åtkomst till dessa url:er stöds inte och påverkar tjänstens funktioner. Dessa url:er motsvarar bara Windows virtuella skrivbordsplatser och resurser och innehåller inte webbadresser för andra tjänster som Azure Active Directory.
I följande tabell visas valfria url:er som dina virtuella Azure-datorer kan ha åtkomst till:
|Adress|Utgående TCP-port|Syfte|Service Tag|
|---|---|---|---|
|*.microsoftonline.com|443|Autentisering till MS Online Services|Inget|
|*.events.data.microsoft.com|443|Telemetritjänst|Inget|
|www.msftconnecttest.com|443|Identifierar om operativsystemet är anslutet till internet|Inget|
|*.prod.do.dsp.mp.microsoft.com|443|Windows Update|Inget|
|login.windows.net|443|Logga in på MS Online Services, Office 365|Inget|
|*.sfx.ms|443|Uppdateringar för OneDrive-klientprogram|Inget|
|*.digicert.com|443|Kontroll av återkallade certifikat|Inget|
>[!NOTE]
>Windows Virtual Desktop har för närvarande ingen lista över IP-adressintervall som du kan vitlista för att tillåta nätverkstrafik. Vi stöder bara vitlistning av specifika webbadresser just nu.
>
>En lista över Office-relaterade URL:er, inklusive nödvändiga Azure Active Directory-relaterade URL:er, finns [i Url:er och IP-adressintervall för Office 365](/office365/enterprise/urls-and-ip-address-ranges).
>
>Du måste använda jokertecknet (*) för webbadresser som involverar servicetrafik. Om du föredrar att inte använda * för agentrelaterad trafik gör du så här för att hitta webbadresserna utan jokertecken:
>
>1. Registrera dina virtuella datorer i värdpoolen för Virtuellt skrivbord i Windows.
>2. Öppna **Loggboken** och navigera till **Windows loggar** > **Program** > **WVD-Agent** och leta efter händelse-ID 3702.
>3. Vitlista webbadresserna som du hittar under händelse-ID 3702. Url:erna under händelse-ID 3702 är regionspecifika. Du måste upprepa vitlistningsprocessen med relevanta webbadresser för varje region som du vill distribuera dina virtuella datorer i.
Windows Virtual Desktop består av de Windows-datorer och -appar som du levererar till användare och hanteringslösningen, som är värd för som en tjänst på Azure av Microsoft. Stationära datorer och appar kan distribueras på virtuella datorer i alla Azure-regioner, och hanteringslösningen och data för dessa virtuella datorer finns i USA. Detta kan resultera i dataöverföring till USA.
För bästa prestanda, se till att nätverket uppfyller följande krav:
* Svarstid för rundresa (RTT) från klientens nätverk till Azure-regionen där värdpooler har distribuerats bör vara mindre än 150 ms.
* Nätverkstrafik kan flöda utanför lands-/regiongränser när virtuella datorer som är värdar för stationära datorer och appar ansluter till hanteringstjänsten.
* För att optimera för nätverksprestanda rekommenderar vi att sessionsvärdens virtuella datorer samlokaliserades i samma Azure-region som hanteringstjänsten.
## <a name="supported-remote-desktop-clients"></a>Klienter som stöds av fjärrskrivbord
Följande klienter för fjärrskrivbord stöder Windows Virtual Desktop:
* [Windows-skrivbordet](connect-windows-7-and-10.md)
* [Webb](connect-web.md)
* [macOS](connect-macos.md)
* [iOS](connect-ios.md)
* [Android (förhandsgranskning)](connect-android.md)
> [!IMPORTANT]
> Windows Virtual Desktop stöder inte RADC-klienten (RemoteApp and Desktop Connections) eller MSTSC-klienten (Remote Desktop Connection).
> [!IMPORTANT]
> Windows Virtual Desktop stöder för närvarande inte klienten för fjärrskrivbord från Windows Store. Stöd för den här klienten kommer att läggas till i en framtida version.
Fjärrskrivbordsklienterna måste ha åtkomst till följande webbadresser:
|Adress|Utgående TCP-port|Syfte|Klient(er)|
|---|---|---|---|
|*.wvd.microsoft.com|443|Trafikerar tjänste-|Alla|
|*.servicebus.windows.net|443|Felsöka data|Alla|
|go.microsoft.com|443|Microsoft FWLinks|Alla|
|aka.ms|443|Microsoft URL förkortare|Alla|
|docs.microsoft.com|443|Dokumentation|Alla|
|privacy.microsoft.com|443|Sekretesspolicy|Alla|
|query.prod.cms.rt.microsoft.com|443|Klientuppdateringar|Windows-skrivbordet|
>[!IMPORTANT]
>Att öppna dessa webbadresser är viktigt för en tillförlitlig klientupplevelse. Att blockera åtkomst till dessa url:er stöds inte och påverkar tjänstens funktioner. Dessa URL:er motsvarar bara klientplatser och resurser och innehåller inte webbadresser för andra tjänster som Azure Active Directory.
## <a name="supported-virtual-machine-os-images"></a>Os-avbildningar som stöds av virtuella datorer
Windows Virtual Desktop stöder följande x64-operativsystemavbildningar:
* Windows 10 Enterprise multi-session, version 1809 eller senare
* Windows 10 Enterprise, version 1809 eller senare
* Windows 7 Enterprise
* Windows Server 2019
* Windows Server 2016
* Windows Server 2012 R2
Windows Virtual Desktop stöder inte x86 (32-bitars), Windows 10 Enterprise N eller Windows 10 Enterprise KN operativsystemavbildningar. Windows 7 stöder inte heller några VHD- eller VHDX-baserade profillösningar som finns på hanterad Azure Storage på grund av en sektorstorleksbegränsning.
Tillgängliga automatiserings- och distributionsalternativ beror på vilket operativsystem och vilken version du väljer, som visas i följande tabell:
|Operativsystem|Azure-avbildningsgalleri|Manuell vm-distribution|Integrering av Azure Resource Manager-mall|Etableringsvärdpooler på Azure Marketplace|Uppdateringar av Windows Virtual Desktop Agent|
|--------------------------------------|:------:|:------:|:------:|:------:|:------:|
|Windows 10 flera session, version 1903|Ja|Ja|Ja|Ja|Automatisk|
|Windows 10 flera session, version 1809|Ja|Ja|Inga|Inga|Automatisk|
|Windows 10 Enterprise, version 1903|Ja|Ja|Ja|Ja|Automatisk|
|Windows 10 Enterprise, version 1809|Ja|Ja|Inga|Inga|Automatisk|
|Windows 7 Enterprise|Ja|Ja|Inga|Inga|Manuell|
|Windows Server 2019|Ja|Ja|Inga|Inga|Automatisk|
|Windows Server 2016|Ja|Ja|Ja|Ja|Automatisk|
|Windows Server 2012 R2|Ja|Ja|Inga|Inga|Automatisk|
## <a name="next-steps"></a>Nästa steg
För att komma igång måste du skapa en klient. Om du vill veta mer om hur du skapar en klient fortsätter du till självstudien för att skapa klient.
> [!div class="nextstepaction"]
> [Skapa en klientorganisation i Windows Virtual Desktop](tenant-setup-azure-active-directory.md)
| 65.995 | 472 | 0.804076 | swe_Latn | 0.99872 |
fa437c56fb0990da63ee27ab125c7a237681106d | 2,635 | md | Markdown | src/pages/blog/gitui.md | mimiyangc/waylonwalkerv2 | 85e3fbe32470210f26b4317a299770bfd1e91ff9 | [
"MIT"
] | 4 | 2019-09-27T03:24:25.000Z | 2020-10-17T02:27:02.000Z | src/pages/blog/gitui.md | mimiyangc/waylonwalkerv2 | 85e3fbe32470210f26b4317a299770bfd1e91ff9 | [
"MIT"
] | 36 | 2020-02-09T19:06:42.000Z | 2022-02-26T17:28:46.000Z | src/pages/blog/gitui.md | mimiyangc/waylonwalkerv2 | 85e3fbe32470210f26b4317a299770bfd1e91ff9 | [
"MIT"
] | 12 | 2020-06-30T10:16:11.000Z | 2022-01-10T17:13:39.000Z | ---
templateKey: blog-post
tags: ['git']
title: Gitui is a blazing fast terminal git interface
date: 2021-01-17T00:00:00
status: published
---
Gitui is a terminal-based git user interface (TUI) that will change the way
that you work with git. I have been a long-time user of the git cli, and it's
been hard to beat, mostly because there is nothing that keeps my fingers on the
keyboard quite like it, except `gitui` which comes with some great ways to very
quickly walk through a git project.
## installation
Go to their [releases](https://github.com/extrawurst/gitui/releases) page,
download the latest build, and pop it on your PATH. I have the following
stuffed away in some install scripts to get the latest version.
_<small>install latest release</small>_
``` bash
# scrape the latest release tag from GitHub and strip it down to a bare version number
GITUI_VERSION=$(curl --silent https://github.com/extrawurst/gitui/releases/latest | tr -d '"' | sed 's/^.*tag\///g' | sed 's/>.*$//g' | sed 's/^v//')
# download the matching musl build and extract the binary straight into /usr/bin
wget https://github.com/extrawurst/gitui/releases/download/v${GITUI_VERSION}/gitui-linux-musl.tar.gz -O- -q | sudo tar -zxf - -C /usr/bin/
```
## run `gitui`
It opens blazing fast.
``` bash
gitui
```
## Quick Commits
Sometimes I edit a number of files and want to commit them one at a time; this
is painful in the git cli and is my main use case for `gitui`. `gitui` shows
unstaged changes at the top, staged changes on the bottom, and a diff on the
right.

## Navigate with hjkl
By default, `gitui` uses arrow keys, but simply copying
[vim_style_key_config.ron](https://github.com/extrawurst/gitui/blob/master/assets/vim_style_key_config.ron)
to your config directory will get you vim-like keybindings.
## workflow
Generally, I pop open `gitui`, use j/k to get to the file I want to commit,
glance at the diff to the right, press enter to stage the file, s to switch
focus to the staged files, c to start the commit, write my commit message,
hit enter, and done.
* w/s: to toggle focus between working and staged changes
* j/k: to scroll each section
* h/l: switch between left and right side
* enter: toggle file from working or staging
* c: start a commit message
* p: push
* `<c-c>`: quit
## Other Panes
I am in the `Status [1]` pane 90% of the time, but `gitui` also has three other
panes: `Log [2]`, `Stashing [3]`, and `Stashes [4]`. I do not really use
the stashing panes, but the `Log [2]` pane is quite useful for quickly going through
the last set of commits and seeing the diff for each of them.
## What UI do you use for git
Let me know what UI you use for git: do you stick to the CLI, use a GUI, or use
a similar `TUI` interface?
| 32.134146 | 149 | 0.724858 | eng_Latn | 0.997023 |
fa43c577cdad2ff7a5c546a0310c0c38e196d0d3 | 1,867 | md | Markdown | src/blog/leetcode-univalued-binary-tree.md | ogdenstudios/ogdenstudios.xyz | 2eec044c9c2aec0ce87dc47b37dd4e1dab391b61 | [
"MIT"
] | null | null | null | src/blog/leetcode-univalued-binary-tree.md | ogdenstudios/ogdenstudios.xyz | 2eec044c9c2aec0ce87dc47b37dd4e1dab391b61 | [
"MIT"
] | 2 | 2017-07-28T20:00:52.000Z | 2017-12-22T05:35:45.000Z | src/blog/leetcode-univalued-binary-tree.md | ogdenstudios/ogdenstudios.xyz | 2eec044c9c2aec0ce87dc47b37dd4e1dab391b61 | [
"MIT"
] | 1 | 2020-10-12T22:42:59.000Z | 2020-10-12T22:42:59.000Z | ---
layout: post
title: "Leetcode problem: Univalued Binary Tree"
tags: [leetcode, 'post']
description: "An easy-level leetcode problem"
date: 2019-12-17
---
## Problem description
[Problem on leetcode](https://leetcode.com/problems/univalued-binary-tree/)
A binary tree is univalued if every node in the tree has the same value.
Return true if and only if the given tree is univalued.
## Intuition
We've got to visit all the nodes of this tree and make sure they're the same value. So I'm figuring I can accomplish this with [depth-first search](https://en.wikipedia.org/wiki/Depth-first_search) or [breadth-first search](https://en.wikipedia.org/wiki/Breadth-first_search). Either will get the job done.
I'll use the root node's value as my comparison value, since it's easy to grab that to start.
Then I can implement DFS or BFS. At each node, I'll check if the value matches the root node's value. If it doesn't, I can return false. If it does, I'll keep on going with the algorithm.
If the algorithm completes and I haven't returned false, I'll return true.
Here's how I expressed that in JavaScript:
```
var isUnivalTree = function(root) {
    let val = root.val; // every node must match the root's value
    let response = true;
    dfs(root);
    return response;
    
    function dfs(node) {
        if (node.val !== val) {
            response = false; // found a mismatch, so the tree is not univalued
            return;
        }
        if (node.left) {
            dfs(node.left);
        }
        if (node.right) {
            dfs(node.right);
        }
    }
};
I chose DFS purely because it's been a few weeks since I last coded it up and wanted aother refresher. Also note I'm using [hoisting](https://scotch.io/tutorials/understanding-hoisting-in-javascript#toc-hoisting-functions) on the function - so I can call `dfs()` above where I've defined it. On my first submission I forgot to actually call it and failed. Classic. | 35.903846 | 364 | 0.691484 | eng_Latn | 0.976853 |
fa449e0764e234ed967b43f216385230d09e06f2 | 3,556 | md | Markdown | README.md | juliacortez/geeta-backend | 2fee54f5a8b34c61c39f9191c56ec86cf09c22b1 | [
"MIT"
] | null | null | null | README.md | juliacortez/geeta-backend | 2fee54f5a8b34c61c39f9191c56ec86cf09c22b1 | [
"MIT"
] | null | null | null | README.md | juliacortez/geeta-backend | 2fee54f5a8b34c61c39f9191c56ec86cf09c22b1 | [
"MIT"
] | null | null | null | <p align="center">
<img src="https://i.ibb.co/h2vNyP2/Novo-Projeto-30.png" width="800px" alt="logo">
</p>
<p align="center">
<a href="https://github.com/juliacortez/geeta-backend/blob/main/LICENSE" target="_blank"><img src="https://img.shields.io/github/license/juliacortez/movies-app?color=blue&style=for-the-badge"></a>
</p>
<h2>⚛️ᅠSobre</h2>
Projeto backend realizado durante o curso web fullstack da <a href="https://www.labenu.com.br/">Labenu</a>. API criada com a finalidade de controlar os produtos da loja de roupas online fictícia, Geeta. Utilizando esta API é possível criar um novo usuário os diferenciando por seus papéis, cliente ou administrador. As senhas cadastradas são criptografadas e então guardadas no banco de dados. Um token é gerado quando o usuário é criado e ao realizar login. Utilizando o token de administrador pode-se criar novos tamanhos, tags e produtos no banco de dados, além de deletar essas informações de acordo com o ID, que é gerado automaticamente ao se cadastrar uma nova informação. Também estão disponíveis requisições para retornar os produtos, tags e tamanhos cadastrados no banco de dados.
<h2>🔗 Documentação</h2>
<a href="https://documenter.getpostman.com/view/17588272/UVyuTvDu">Geeta Backend</a>
<h2>🛠️ Tecnologias</h2>
<li><a href="https://nodejs.org/en/">Node.js</a> (v14.17.4)</li>
<li><a href="https://www.typescriptlang.org/">Typescript</a> (v4.6.3)</li>
<li><a href="http://expressjs.com/pt-br/">Express</a> (v4.17.3)</li>
<li><a href="https://knexjs.org/">Knex</a> (v1.0.4)</li>
<li><a href="https://www.npmjs.com/package/bcrypt">Bcryptjs</a> (v2.4.3)</li>
<li><a href="https://www.npmjs.com/package/cors">Cors</a> (v2.8.5)</li>
<li><a href="https://www.npmjs.com/package/jsonwebtoken">Jsonwebtoken</a> (v8.5.1)</li>
<li><a href="https://npm.io/package/uuid">UUID</a> (v8.3.2)</li>
<li><a href="https://www.npmjs.com/package/mysql">MySQL</a> (v2.18.1)</li>
<h2>💻 Pré-requisitos</h2>
Antes de começar você vai precisar ter instalado em sua máquina as seguintes ferramentas:
<a href="https://git-scm.com">Git</a>, <a href="https://nodejs.org.en/">Node.js</a>.
Além disto ter um editor para trabalhar com o código, para o desenvolvimento deste projeto foi utilizado o <a href="https://code.visualstudio.com/">VSCode</a>.
<h2>🚀 Rodando o projeto</h2>
```bash
# Clone este repositório
# HTTPS
$ git clone https://github.com/juliacortez/geeta-backend.git
# CLI
$ gh repo clone gh repo clone juliacortez/geeta-backend
# Acesse a pasta do projeto no terminal/cmd
$ cd geeta-backend
# Instale as dependências
$ npm install
# Execute o comando para iniciar a aplicação
$ npm run start
# O servidor iniciará na porta:3003
# Se necessário execute o comando para criar as tabelas
$ npm run migrations
```
<h2>👩🏻💻 Desenvolvedora</h2>
<div><a href="https://github.com/juliacortez">
<img style="border-radius: 50%;" src="https://media-exp1.licdn.com/dms/image/C5603AQFLn8A145Rfww/profile-displayphoto-shrink_800_800/0/1635911104301?e=1653523200&v=beta&t=E3V1eTckX1gq0-7eq5AfRaumATFbuLsufB8lHpNa4zk" width="150px" alt="Julia Cortez">
<br />
<sub><b>Julia Cortez</sub></b></a>
<br />
Entre em contato!<br />
<a href="https://www.linkedin.com/in/juliacortez-98/" target="_blank"><img src="https://img.shields.io/badge/LinkedIn-0077B5?style=for-the-badge&logo=linkedin&logoColor=white" target="_blank"></a>
<a href="mailto:juliacortez984@gmail.com"><img src="https://img.shields.io/badge/Gmail-D14836?style=for-the-badge&logo=gmail&logoColor=white" target="_blank"></a>
</div>
| 53.074627 | 791 | 0.726378 | por_Latn | 0.909704 |
fa44e963647fd876a8a75bb3422665fc3aa2f713 | 1,181 | md | Markdown | _posts/2017-11-03-Romantica-of-Devon-Oceana-2014.md | celermarryious/celermarryious.github.io | bcf6ff5049c82e276226a68ba269c11ccca7f970 | [
"MIT"
] | null | null | null | _posts/2017-11-03-Romantica-of-Devon-Oceana-2014.md | celermarryious/celermarryious.github.io | bcf6ff5049c82e276226a68ba269c11ccca7f970 | [
"MIT"
] | null | null | null | _posts/2017-11-03-Romantica-of-Devon-Oceana-2014.md | celermarryious/celermarryious.github.io | bcf6ff5049c82e276226a68ba269c11ccca7f970 | [
"MIT"
] | null | null | null | ---
layout: post
date: 2017-11-03
title: "Romantica of Devon Oceana 2014"
category: Romantica of Devon
tags: [Romantica of Devon,2014]
---
### Romantica of Devon Oceana
Just **$329.99**
### 2014
<table><tr><td>BRANDS</td><td>Romantica of Devon</td></tr><tr><td>Years</td><td>2014</td></tr></table>
<a href="https://www.readybrides.com/en/romantica-of-devon/66645-romantica-of-devon-oceana.html"><img src="//img.readybrides.com/154376/romantica-of-devon-oceana.jpg" alt="Romantica of Devon Oceana" style="width:100%;" /></a>
<!-- break --><a href="https://www.readybrides.com/en/romantica-of-devon/66645-romantica-of-devon-oceana.html"><img src="//img.readybrides.com/154378/romantica-of-devon-oceana.jpg" alt="Romantica of Devon Oceana" style="width:100%;" /></a>
<a href="https://www.readybrides.com/en/romantica-of-devon/66645-romantica-of-devon-oceana.html"><img src="//img.readybrides.com/154374/romantica-of-devon-oceana.jpg" alt="Romantica of Devon Oceana" style="width:100%;" /></a>
Buy it: [https://www.readybrides.com/en/romantica-of-devon/66645-romantica-of-devon-oceana.html](https://www.readybrides.com/en/romantica-of-devon/66645-romantica-of-devon-oceana.html)
| 69.470588 | 239 | 0.728196 | yue_Hant | 0.471821 |
fa44ee36aca66b204d85c699c7e0ce051cda5917 | 351 | md | Markdown | collections/_publications/2011-GatzhammerMehlNeckel.md | JurgenKersschot/precice.github.io | 32e61fe70cde8fd3cec4329b4bfa4c3c9f219f73 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | collections/_publications/2011-GatzhammerMehlNeckel.md | JurgenKersschot/precice.github.io | 32e61fe70cde8fd3cec4329b4bfa4c3c9f219f73 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | collections/_publications/2011-GatzhammerMehlNeckel.md | JurgenKersschot/precice.github.io | 32e61fe70cde8fd3cec4329b4bfa4c3c9f219f73 | [
"MIT",
"BSD-3-Clause"
] | null | null | null | ---
title: "Partitioned Fluid-Structure Interaction Coupling Methods for Black-Box Solvers"
pub-url:
year: 2011
authors: "Bernhard Gatzhammer, Miriam Mehl, Tobias Neckel"
---
In M. Behr, J. Lang, E. Rank and M. Schäfer (ed.), 2nd International Conference on Computational Engineering (ICCE 2011), p. 76–77. typographics GmbH, Darmstadt, October 2011
| 43.875 | 174 | 0.757835 | eng_Latn | 0.443409 |
fa454fd4d5493534e924d1a690136985317811b8 | 370 | md | Markdown | Chapter14/SQLServer/README.md | rajadileepkolli/Up-and-Running-with-jOOQ | 5004c61718e8c7809d5f5319dd35a875793ecc69 | [
"MIT"
] | null | null | null | Chapter14/SQLServer/README.md | rajadileepkolli/Up-and-Running-with-jOOQ | 5004c61718e8c7809d5f5319dd35a875793ecc69 | [
"MIT"
] | null | null | null | Chapter14/SQLServer/README.md | rajadileepkolli/Up-and-Running-with-jOOQ | 5004c61718e8c7809d5f5319dd35a875793ecc69 | [
"MIT"
] | null | null | null | # Chapter 14 - Derived Tables, CTE, and Views
Derived tables, CTEs (Common Table Expressions), and views are important players in the SQL context; they're useful for organizing and optimizing the reuse of long and complex queries (typically base queries and/or queries that are expensive in performance terms), and for improving readability by breaking the code down into separate steps.
| 92.5 | 322 | 0.791892 | eng_Latn | 0.997667 |
fa4550ca77b2505b274f247474c508d54914f657 | 6,806 | md | Markdown | docs/2014/analysis-services/server-configuration-utility-data-mining-add-ins-for-excel.md | kirabr/sql-docs.ru-ru | 08e3b25ff0792ee0ec4c7641b8960145bbec4530 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/analysis-services/server-configuration-utility-data-mining-add-ins-for-excel.md | kirabr/sql-docs.ru-ru | 08e3b25ff0792ee0ec4c7641b8960145bbec4530 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/analysis-services/server-configuration-utility-data-mining-add-ins-for-excel.md | kirabr/sql-docs.ru-ru | 08e3b25ff0792ee0ec4c7641b8960145bbec4530 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Server Configuration Utility (Data Mining Add-ins for Excel) | Microsoft Docs
ms.custom: ''
ms.date: 04/27/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.suite: ''
ms.technology:
- analysis-services
ms.tgt_pltfrm: ''
ms.topic: conceptual
ms.assetid: 28435f86-5cec-4a1e-9b7d-b2069c1ddddb
caps.latest.revision: 6
author: minewiskan
ms.author: owend
manager: craigg
ms.openlocfilehash: ae026e24210385a24b53f9ddf0cf1533a0e27e40
ms.sourcegitcommit: c18fadce27f330e1d4f36549414e5c84ba2f46c2
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 07/02/2018
ms.locfileid: "37178931"
---
# <a name="server-configuration-utility-data-mining-add-ins-for-excel"></a>Средство настройки сервера (надстройки интеллектуального анализа данных для Excel)
При установке надстроек интеллектуального анализа данных для Excel средство настройки сервера также устанавливается и запускается при первом открытии надстроек. В этом разделе рассматривается использование этой программы для подключения к экземпляру служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)] и настройки базы данных для работы с моделями интеллектуального анализа данных.
## <a name="bkmk_step1"></a> Шаг 1: Подключение к службам Analysis Services
Выберите сервер служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)], который будет предоставлять алгоритмы и сохранять модели интеллектуального анализа данных.
При создании соединения для интеллектуального анализа данных необходимо выбрать сервер, где можно поэкспериментировать с моделями интеллектуального анализа данных. Рекомендуется создать новую базу данных на сервере и выделить ее для интеллектуального анализа данных либо попросить администратора подготовить сервер интеллектуального анализа данных. Это позволит строить модели, не оказывая влияния на производительность других служб.
Обратите внимание, что сервер служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)], используемый для интеллектуального анализа данных, не должен обязательно находиться на одном сервере с исходными данными.
**Имя сервера**
Введите имя сервера с установленным экземпляром служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)], который будет использоваться для интеллектуального анализа данных.
**Проверка подлинности**
Укажите методы проверки подлинности. Если администратор еще не настроил доступ к серверу с помощью HTTPPump, то для соединения со службами [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)] требуется проверка подлинности Windows.
## <a name="bkmk_step2"></a> Шаг 2: Включение временных моделей
Перед использованием надстроек необходимо изменить свойство сервера служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)] так, чтобы оно разрешало создание временных моделей интеллектуального анализа данных.
Временных моделей интеллектуального анализа, также называются *модели сеанса*. Это связано с тем, что эти модели сохраняются, только пока открыт текущий сеанс. Как только соединение с сервером закрывается, сеанс заканчивается и все модели, использовавшиеся во время сеанса, удаляются.
Использование моделей интеллектуального анализа данных сеанса не влияет на объем памяти на сервере, но нагрузка, вызванная созданием моделей, может повлиять на производительность сервера.
Вначале мастер определяет заданные на сервере параметры. Если на сервере уже разрешено использование временных моделей интеллектуального анализа, вы можете щелкнуть **Далее** для продолжения. Мастер также показывает инструкции для включения временных моделей интеллектуального анализа данных на указанном сервере служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)] или для выполнения запроса системному администратору служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)].
## <a name="bkmk_step3"></a> Шаг 3: Создание базы данных для пользователей надстроек
На этой странице мастера установки и настройки можно создать новую базу данных, которая будет предназначена для интеллектуального анализа данных, или выбрать существующую базу данных [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)].
> [!WARNING]
> Для интеллектуального анализа данных требуется экземпляр служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)], работающий в многомерном режиме. Табличный режим не поддерживает интеллектуальный анализ данных.
Рекомендуется создать отдельную базу данных, специально предназначенную для интеллектуального анализа данных. Это позволит экспериментировать с моделями интеллектуального анализа данных, не оказывая влияния на другие объекты решения.
Если выбрана существующая база данных на экземпляре служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)], убедитесь, что можно перезаписать существующие модели, если окажется, что при создании модели с помощью надстроек модель с таким именем уже существует.
**Создание новой базы данных**
Выберите этот параметр для создания новой базы данных на заданном сервере. В базе данных интеллектуального анализа данных будут храниться источники данных, структуры и модели интеллектуального анализа данных.
**Имя базы данных**
Введите имя новой базы данных. Если имя не используется, оно будет создано.
**Использовать существующую базу данных**
Выберите этот параметр для использования существующей базы данных служб [!INCLUDE[ssASnoversion](../includes/ssasnoversion-md.md)].
**База данных**
Если используется существующая база данных, следует выбрать ее имя из списка.
## <a name="bkmk_step4"></a> Шаг 4: Предоставление пользователям надстроек соответствующих разрешений
Следует убедиться, что все пользователи, включая пользователей надстроек, имеют необходимые разрешения на просмотр, изменение, обработку или создание структур и моделей интеллектуального анализа данных.
По умолчанию для использования надстроек требуется встроенная проверка подлинности Windows.
**Предоставить пользователям надстроек административные разрешения**
Выберите этот параметр для предоставления заданным пользователям административного доступа к базе данных.
> [!NOTE]
> Эти разрешения применяются только к базе данных, указанной в **имя базы данных** поле.
**Имя базы данных**
Отображает имя выбранной базы данных.
**Укажите пользователей или группы для добавления**
Выберите имена входа, которые будут иметь доступ к базе данных, используемой для интеллектуального анализа данных.
**Добавить**
Нажмите, чтобы открыть диалоговое окно и добавить пользователей или группы.
**Удалить**
Нажмите, чтобы удалить выбранного пользователя или выбранную группу из списка пользователей с административными разрешениями.
| 70.164948 | 499 | 0.80238 | rus_Cyrl | 0.971289 |
fa45cd6bd40be8f4ba4efb1825d15d8cea0f0640 | 1,327 | md | Markdown | content/author/flinn/_index.md | CJCascalheira/cjcascalheira | 30012ba64fba765690d866154e3b812293024ce9 | [
"MIT"
] | null | null | null | content/author/flinn/_index.md | CJCascalheira/cjcascalheira | 30012ba64fba765690d866154e3b812293024ce9 | [
"MIT"
] | null | null | null | content/author/flinn/_index.md | CJCascalheira/cjcascalheira | 30012ba64fba765690d866154e3b812293024ce9 | [
"MIT"
] | null | null | null | ---
authors:
- flinn
bio: Research interests include mental health help-seeking, the experiences of sexual and gender minority people in healthcare settings, and trauma-sensitive/trauma-informed practice. Clinical experience in community, college, primary care, and private practice settings.
education:
courses:
- course: M.A. in Clinical Psychology
institution: University of Detroit Mercy
year: 2016
email: flinn@nmsu.edu
interests:
- Mental Health Help-Seeking
- LGBTQ+ in Healthcare
- Trauma-Sensitive/Trauma-Informed Practice
name: Ryan E. Flinn
organizations:
- name: New Mexico State University
url: https://cep.nmsu.edu/
role: Doctoral Researcher in Counseling Psychology
social:
- icon: envelope
icon_pack: fas
link: mailto:flinn@nmsu.edu
superuser: false
user_groups:
- Researchers
---
Ryan E. Flinn, M.A. (he/him) is a counseling psychology doctoral candidate at New Mexico State University.
He received his master’s degree in clinical psychology from the University of Detroit Mercy and has worked as a therapist in community, college, primary care, and private practice settings for five years. Ryan’s scholarly work focuses on mental health help-seeking, the experiences of sexual and gender minority people
in healthcare settings, and trauma-sensitive/trauma-informed practice. | 33.175 | 318 | 0.784476 | eng_Latn | 0.978755 |
fa4698b6d5afbf9c1dc505b0bc98a0fb84282790 | 3,016 | md | Markdown | content_development/15_MAIN_Open_Access_and_discoverability.md | tosteiner/Module-6-Open-Access-to-Research-Papers | 47db3fdd46fa538a8362c95c3555321edc341c9a | [
"CC0-1.0"
] | 13 | 2018-10-24T17:04:23.000Z | 2021-07-15T18:30:22.000Z | content_development/15_MAIN_Open_Access_and_discoverability.md | tosteiner/Module-6-Open-Access-to-Research-Papers | 47db3fdd46fa538a8362c95c3555321edc341c9a | [
"CC0-1.0"
] | 20 | 2018-08-07T05:34:03.000Z | 2019-11-04T15:45:29.000Z | content_development/15_MAIN_Open_Access_and_discoverability.md | tosteiner/Module-6-Open-Access-to-Research-Papers | 47db3fdd46fa538a8362c95c3555321edc341c9a | [
"CC0-1.0"
] | 10 | 2018-07-15T15:52:29.000Z | 2019-08-02T09:38:14.000Z | ---
output:
pdf_document: default
html_document: default
---
## Open Access and discoverability <a name="discoverability"></a>
Google Scholar is probably pretty much every researcher's go-to choice for academic search engines. And while it does a pretty good job of letting you know if you have access to a PDF or not, what it often does not tell you is whether these articles are in violation of copyright or not.
Open Access articles (true OA) do not violate copyright, and they can be more easily discovered and re-used as a result.
There are now a cool array of what we might call 'open discovery engines' that help you to discover OA content amidst the millions of articles out there.
For example, [Open Knowledge Maps](https://openknowledgemaps.org/) provides a visual network-like interface integrated with [BASE](https://www.base-search.net/) (all disciplines) and [PubMed](https://www.ncbi.nlm.nih.gov/pubmed/) (life sciences).
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/J8QMa8K1daU" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
Other alternative discovery platforms include [ScienceOpen](https://www.scienceopen.com/), which we further discuss in the section on scholarly collaboration platforms.
...
There are a couple of steps that you as an individual can take to ensure that your own OA publications are discoverable by the variety of existing search engines out there.
In brief, these comprise the generation of standardized metadata that describe your work in a machine-readable way. Many of the existing repositories including [Humanities Commons](https://hcommons.org/), [Zenodo](https://zenodo.org/), and the Open Science Framework [osf.io](https://osf.io/) automatically do this for you, so we think it advisable to use these or other repositories (including those offered by your home institution) to make your work available online (see also Green OA).
Now, while metadata help make your work discoverable, they also have the additional benefit of increasing the **accessibility of your publication**, because support tools such as screen readers make extensive use of available metadata. Further to that, the **reproducibility** of your work is improved through metadata (see e.g. Kemp et al. 2018).
For more details on these, see e.g. the [JISC guidelines](https://www.jisc.ac.uk/guides/open-access-discovery-usage-and-impact).
### Further reading:
Bull, J. and Schultz, T.A., 2018. Harvesting the Academic Landscape: Streamlining the Ingestion of Professional Scholarship Metadata into the Institutional Repository. Journal of Librarianship and Scholarly Communication, 6(1), p.eP2201. DOI: [10.7710/2162-3309.2201](https://doi.org/10.7710/2162-3309.2201)
Kemp, J., Dean, C. & Chodacki, J. (2018) Can Richer Metadata Rescue Research?, The Serials Librarian, 74:1-4, 207-211, DOI: [10.1080/0361526X.2018.1428483](https://doi.org/10.1080/0361526X.2018.1428483)
| 83.777778 | 490 | 0.777188 | eng_Latn | 0.98665 |
fa46a745159193df76fe9057fcdc70f9f120d68e | 917 | md | Markdown | content/post/2018-ChemViews-PhD/index.md | chemist09/academic-webpage | 616317a684adf5a1c780a6c354824ce916254965 | [
"MIT"
] | null | null | null | content/post/2018-ChemViews-PhD/index.md | chemist09/academic-webpage | 616317a684adf5a1c780a6c354824ce916254965 | [
"MIT"
] | null | null | null | content/post/2018-ChemViews-PhD/index.md | chemist09/academic-webpage | 616317a684adf5a1c780a6c354824ce916254965 | [
"MIT"
] | null | null | null | ---
# Documentation: https://wowchemy.com/docs/managing-content/
title: ChemViews Article about Doing a PhD in two Countries
subtitle: Two countries, two supervisors, two labs, and one joint research project. While my home institution is in Germany, I spent more than one year of my time as a graduate student in Australia.
summary: In this ChemistryViews article, I shared my experience as a PhD student doing research in two research groups in Germany and Australia.
authors: []
tags: []
categories: []
date: 2018-04-03
featured: false
draft: false
# Featured image
# To use, add an image named `featured.jpg/png` to your page's folder.
# Focal points: Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight.
image:
caption: ""
focal_point: ""
preview_only: false
authors:
- admin
---
Find the ChemistryViews article here [ChemViews](https://doi.org/10.1002/chemv.201800011). | 33.962963 | 198 | 0.756816 | eng_Latn | 0.978549 |
fa472fa0fa82f7beb6a538188ad0de4f1973ac90 | 5,687 | md | Markdown | 6-Objects.md | abinavseelan/js-for-newbies-2 | dfa19bb3343f88d6b4f6b569b6e738679bfac0de | [
"MIT"
] | 1 | 2018-03-06T08:15:26.000Z | 2018-03-06T08:15:26.000Z | 6-Objects.md | abinavseelan/js-for-newbies-2 | dfa19bb3343f88d6b4f6b569b6e738679bfac0de | [
"MIT"
] | 1 | 2018-02-05T16:31:28.000Z | 2018-02-07T10:21:38.000Z | 6-Objects.md | abinavseelan/js-for-newbies-2 | dfa19bb3343f88d6b4f6b569b6e738679bfac0de | [
"MIT"
] | 2 | 2018-01-29T12:16:52.000Z | 2018-02-05T08:01:17.000Z | # Objects
An Object is an encapsulation of other data types. You can think of Objects as programmatic models of real-world entities.
## An Example
Say we want to describe a human in our program. A human has a name, an age and their preferred food choice. Using just variables, we would need to do something like this.
```javascript
let humanName = 'Tony Stark';
let humanAge = 40; // probably?
let humanFoodPreference = 'Shawarma';
let humanIsAnAvenger = true;
```
This does not scale well. The human here has just 4 properties. What if he/she has more? How can we semantically handle this?
And this...is where objects come in. An object is an encapsulation of data. Since we're defining properties for a human, a *human* can be an object that encapsulates properties (of other primitive data types) inside it, so that the data is semantically aggregated.
## Objects in Javascript
### Creating an object
To create an object, we use the `{}` brackets.
```javascript
let human = {};
```
Properties for the *human* are specified within the `{}` brackets.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
foodPreference: 'Shawarma',
isAnAvenger: true
};
```
### Storing arrays, functions and objects
Properties are stored as *key-value* pairs. In the above snippet, the keys would be *name*, *age*, *foodPreference* and *isAnAvenger*. A *key* can hold any value type, even arrays, functions and objects!
```javascript
let human = {
name: 'Tony Stark',
age: 40,
foodPreference: 'Shawarma',
isAnAvenger: true,
friends: ['Thor', 'Hawkeye', 'Steve Rogers', 'Bruce Banner'], // Array
says: function() { // Function
console.log("I'm Iron Man");
},
girlfriend: { // Another object
name: 'Pepper Potts',
age: 30,
isAnAvenger: false
}
};
```
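Values stored this way behave like any normal array, function, or object. Here's a tiny illustrative snippet of mine (it assumes the `human` object defined just above; the `.` notation used here is covered in detail in the next section):
```javascript
console.log(human.friends[0]); // This will print 'Thor'
human.says();                  // This will print I'm Iron Man
```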
### Accessing values
Objects use the `.` notation to access values stored in *keys*. For example, to access the *foodPreference* of this *human*:
```javascript
let human = {
name: 'Tony Stark',
age: 40,
foodPreference: 'Shawarma',
isAnAvenger: true
};
console.log(human.foodPreference); // This will print Shawarma.
```
A *key* can also be more than one word. But to do that the *key* will have to be quoted.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
```
If the *key* is multiple words, the `.` notation will not work. Here you have to use `[]` to access the property.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
console.log(human['food preference']); // This will print Shawarma.
```
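As a handy side note of mine (not part of the original lesson), the brackets accept any expression, so you can also look up a key that's stored in a variable:
```javascript
let key = 'food preference';
console.log(human[key]); // This will print Shawarma.
```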
Nested objects are accessed the same way.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
foodPreference: 'Shawarma',
isAnAvenger: true,
girlfriend: { // Another object
name: 'Pepper Potts',
age: 30,
isAnAvenger: false
}
};
console.log(human.girlfriend.name); // This will print 'Pepper Potts'
```
**Pro Tip** 💡: You can also directly copy object properties to other variables. You can do this in two ways.
The first is by using the `.` notation.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
let name = human.name;
```
However, the preferred way to copy over the properties to variables is through a Javascript construct called object *destructuring*. For example, if we want access to the *name* and *age* properties, we can just do this
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
let { name, age } = human;
console.log(name); // This will print 'Tony Stark'
console.log(age); // This will print 40
```
Here the variables share the same name as the keys. We can also rename the variables:
```javascript
let { name: N, age: A } = human;
console.log(N); // This will print 'Tony Stark'
console.log(A); // This will print 40
```
### Adding more properties
After an object is created, we can add more properties directly to the object using the `.` notation again.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
human.facialHair = true;
console.log(human);
```
This will print
```bash
{
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
  isAnAvenger: true,
  facialHair: true
}
```
### Modifying existing properties
Similar to arrays, objects allow you to directly modify any value stored in an object property:
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
human.age = 35;
console.log(human);
```
This will print the following to the console.
```bash
{
name: 'Tony Stark',
    age: 35,
'food preference': 'Shawarma',
isAnAvenger: true
}
```
### Deleting properties
Properties, once added, can be deleted using the `delete` keyword.
```javascript
let human = {
name: 'Tony Stark',
age: 40,
'food preference': 'Shawarma',
isAnAvenger: true
};
delete human.isAnAvenger; // Shh...let's keep it a secret!
console.log(human);
```
This will print the following to the console.
```bash
{
name: 'Tony Stark',
    age: 40,
'food preference': 'Shawarma'
}
```
**Pro Tip** 💡: Did you know that arrays in Javascript are *actually* objects? 😱
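Here's a quick sketch of mine to see this for yourself (try it in a browser console):
```javascript
let avengers = ['Thor', 'Hawkeye'];
console.log(typeof avengers); // This will print 'object'
avengers.leader = 'Steve Rogers'; // you can even add properties to an array!
console.log(avengers.leader); // This will print 'Steve Rogers'
```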
### 🕵 Test your skills!
1) Try to model something you use on a daily basis, like a cellphone or a laptop as a javascript object. Add properties to it, delete properties from it. | 22.748 | 264 | 0.668191 | eng_Latn | 0.986561 |
fa47ad485e4e5624f1f9dfde5ea43e3080583809 | 3,563 | md | Markdown | documents/amazon-redshift-developer-guide/doc_source/r_WF_DENSE_RANK.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | 5 | 2021-08-13T09:20:58.000Z | 2021-12-16T22:13:54.000Z | documents/amazon-redshift-developer-guide/doc_source/r_WF_DENSE_RANK.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | null | null | null | documents/amazon-redshift-developer-guide/doc_source/r_WF_DENSE_RANK.md | siagholami/aws-documentation | 2d06ee9011f3192b2ff38c09f04e01f1ea9e0191 | [
"CC-BY-4.0"
] | null | null | null | # DENSE\_RANK window function<a name="r_WF_DENSE_RANK"></a>
The DENSE\_RANK window function determines the rank of a value in a group of values, based on the ORDER BY expression in the OVER clause\. If the optional PARTITION BY clause is present, the rankings are reset for each group of rows\. Rows with equal values for the ranking criteria receive the same rank\. The DENSE\_RANK function differs from RANK in one respect: If two or more rows tie, there is no gap in the sequence of ranked values\. For example, if two rows are ranked 1, the next rank is 2\.
You can have ranking functions with different PARTITION BY and ORDER BY clauses in the same query\.
## Syntax<a name="r_WF_DENSE_RANK-synopsis"></a>
```
DENSE_RANK () OVER
(
[ PARTITION BY expr_list ]
[ ORDER BY order_list ]
)
```
## Arguments<a name="r_WF_DENSE_RANK-arguments"></a>
\( \)
The function takes no arguments, but the empty parentheses are required\.
OVER
The window clauses for the DENSE\_RANK function\.
PARTITION BY *expr\_list*
Optional\. One or more expressions that define the window\.
ORDER BY *order\_list*
Optional\. The expression on which the ranking values are based\. If no PARTITION BY is specified, ORDER BY uses the entire table\. If ORDER BY is omitted, the return value is 1 for all rows\.
If ORDER BY doesn't produce a unique ordering, the order of the rows is nondeterministic\. For more information, see [Unique ordering of data for window functions](r_Examples_order_by_WF.md)\.
## Return type<a name="c_Supported_data_types_wf_dense_rank"></a>
INTEGER
## Examples<a name="r_WF_DENSE_RANK-examples"></a>
The following example orders the table by the quantity sold \(in descending order\), and assigns both a dense rank and a regular rank to each row\. The results are sorted after the window function results are applied\.
```
select salesid, qty,
dense_rank() over(order by qty desc) as d_rnk,
rank() over(order by qty desc) as rnk
from winsales
order by 2,1;
salesid | qty | d_rnk | rnk
---------+-----+-------+-----
10001 | 10 | 5 | 8
10006 | 10 | 5 | 8
30001 | 10 | 5 | 8
40005 | 10 | 5 | 8
30003 | 15 | 4 | 7
20001 | 20 | 3 | 4
20002 | 20 | 3 | 4
30004 | 20 | 3 | 4
10005 | 30 | 2 | 2
30007 | 30 | 2 | 2
40001 | 40 | 1 | 1
(11 rows)
```
Note the difference in rankings assigned to the same set of rows when the DENSE\_RANK and RANK functions are used side by side in the same query\. For a description of the WINSALES table, see [Overview example for window functions](c_Window_functions.md#r_Window_function_example)\.
The following example partitions the table by SELLERID and orders each partition by the quantity \(in descending order\) and assigns a dense rank to each row\. The results are sorted after the window function results are applied\.
```
select salesid, sellerid, qty,
dense_rank() over(partition by sellerid order by qty desc) as d_rnk
from winsales
order by 2,3,1;
salesid | sellerid | qty | d_rnk
---------+----------+-----+-------
10001 | 1 | 10 | 2
10006 | 1 | 10 | 2
10005 | 1 | 30 | 1
20001 | 2 | 20 | 1
20002 | 2 | 20 | 1
30001 | 3 | 10 | 4
30003 | 3 | 15 | 3
30004 | 3 | 20 | 2
30007 | 3 | 30 | 1
40005 | 4 | 10 | 2
40001 | 4 | 40 | 1
(11 rows)
```
For a description of the WINSALES table, see [Overview example for window functions](c_Window_functions.md#r_Window_function_example)\. | 40.033708 | 502 | 0.668538 | eng_Latn | 0.988163 |
fa47d20b47fa8dad411c2709ec7374f62cc3c556 | 149 | md | Markdown | jsonrpc/README.md | Mu-L/prisma-client-go | a6b87709e310f8fd4de86820027f2896d2b1998e | [
"Apache-2.0"
] | 42 | 2019-09-25T12:08:41.000Z | 2020-01-24T06:56:08.000Z | jsonrpc/README.md | Mu-L/prisma-client-go | a6b87709e310f8fd4de86820027f2896d2b1998e | [
"Apache-2.0"
] | 21 | 2019-09-25T13:22:27.000Z | 2020-01-27T14:13:20.000Z | jsonrpc/README.md | withevideo/prisma-client-go | d6e75e7f0bfcc9f270229df2e819021d9a42ddfa | [
"Apache-2.0"
] | 1 | 2020-01-23T06:00:02.000Z | 2020-01-23T06:00:02.000Z | # JSON-RPC
This package is needed for communication between the Prisma CLI and the invoked generator (generator being the whole Go client package).
| 37.25 | 136 | 0.805369 | eng_Latn | 0.999379 |
fa48f39060a150068dcbbe3e73b165234a2577c0 | 6,395 | md | Markdown | sdk-api-src/content/winreg/nf-winreg-regcreatekeyw.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/winreg/nf-winreg-regcreatekeyw.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/winreg/nf-winreg-regcreatekeyw.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:winreg.RegCreateKeyW
title: RegCreateKeyW function (winreg.h)
description: Creates the specified registry key. If the key already exists in the registry, the function opens it.
helpviewer_keywords: ["RegCreateKey","RegCreateKey function","RegCreateKeyA","RegCreateKeyW","_win32_regcreatekey","base.regcreatekey","winreg/RegCreateKey","winreg/RegCreateKeyA","winreg/RegCreateKeyW"]
old-location: base\regcreatekey.htm
tech.root: winprog
ms.assetid: cb4d30f4-e288-41e8-86e0-807c313db53d
ms.date: 12/05/2018
ms.keywords: RegCreateKey, RegCreateKey function, RegCreateKeyA, RegCreateKeyW, _win32_regcreatekey, base.regcreatekey, winreg/RegCreateKey, winreg/RegCreateKeyA, winreg/RegCreateKeyW
req.header: winreg.h
req.include-header: Windows.h
req.target-type: Windows
req.target-min-winverclnt: Windows 2000 Professional [desktop apps only]
req.target-min-winversvr: Windows 2000 Server [desktop apps only]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi: RegCreateKeyW (Unicode) and RegCreateKeyA (ANSI)
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: Advapi32.lib
req.dll: Advapi32.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- RegCreateKeyW
- winreg/RegCreateKeyW
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- DllExport
api_location:
- Advapi32.dll
- API-MS-Win-Core-Registry-l2-1-0.dll
- advapi32legacy.dll
- API-MS-Win-Core-Registry-l2-2-0.dll
api_name:
- RegCreateKey
- RegCreateKeyA
- RegCreateKeyW
---
# RegCreateKeyW function
## -description
Creates the specified registry key. If the key already exists in the registry, the function opens it.
<div class="alert"><b>Note</b> This function is provided only for compatibility with 16-bit versions of Windows. Applications should use the
<a href="/windows/desktop/api/winreg/nf-winreg-regcreatekeyexa">RegCreateKeyEx</a> function. However, applications that back up or restore system state including system files and registry hives should use the <a href="/windows/win32/vss/volume-shadow-copy-service-overview">Volume Shadow Copy Service</a> instead of the registry functions.</div><div> </div>
## -parameters
### -param hKey [in]
A handle to an open registry key. The calling process must have KEY_CREATE_SUB_KEY access to the key. For more information, see
<a href="/windows/desktop/SysInfo/registry-key-security-and-access-rights">Registry Key Security and Access Rights</a>.
Access for key creation is checked against the security descriptor of the registry key, not the access mask specified when the handle was obtained. Therefore, even if <i>hKey</i> was opened with a <i>samDesired</i> of KEY_READ, it can be used in operations that create keys if allowed by its security descriptor.
This handle is returned by the
<a href="/windows/desktop/api/winreg/nf-winreg-regcreatekeyexa">RegCreateKeyEx</a> or
<a href="/windows/desktop/api/winreg/nf-winreg-regopenkeyexa">RegOpenKeyEx</a> function, or it can be one of the following
<a href="/windows/desktop/SysInfo/predefined-keys">predefined keys</a>:<dl>
<dd><b>HKEY_CLASSES_ROOT</b></dd>
<dd><b>HKEY_CURRENT_CONFIG</b></dd>
<dd><b>HKEY_CURRENT_USER</b></dd>
<dd><b>HKEY_LOCAL_MACHINE</b></dd>
<dd><b>HKEY_USERS</b></dd>
</dl>
### -param lpSubKey [in, optional]
The name of a key that this function opens or creates. This key must be a subkey of the key identified by the <i>hKey</i> parameter.
For more information on key names, see <a href="/windows/desktop/SysInfo/structure-of-the-registry">Structure of the Registry</a>.
If <i>hKey</i> is one of the predefined keys, <i>lpSubKey</i> may be <b>NULL</b>. In that case, <i>phkResult</i> receives the same <i>hKey</i> handle passed into the function.
### -param phkResult [out]
A pointer to a variable that receives a handle to the opened or created key. If the key is not one of the predefined registry keys, call the
<a href="/windows/desktop/api/winreg/nf-winreg-regclosekey">RegCloseKey</a> function after you have finished using the handle.
## -returns
If the function succeeds, the return value is ERROR_SUCCESS.
If the function fails, the return value is a nonzero error code defined in Winerror.h. You can use the
<a href="/windows/desktop/api/winbase/nf-winbase-formatmessage">FormatMessage</a> function with the FORMAT_MESSAGE_FROM_SYSTEM flag to get a generic description of the error.
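For illustration only (this helper is not part of the original reference, and the buffer size is one reasonable choice), a nonzero status code can be turned into readable text along these lines:

```cpp
#include <windows.h>
#include <stdio.h>

// Hypothetical helper: print the system description for a registry
// error code, such as a failure status returned by RegCreateKey.
void PrintRegistryError(LONG errorCode)
{
    char message[512];
    DWORD length = FormatMessageA(
        FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS,
        NULL,               // lpSource is unused with FORMAT_MESSAGE_FROM_SYSTEM
        (DWORD)errorCode,   // the status code to describe
        0,                  // default language lookup
        message,
        sizeof(message),
        NULL);

    if (length > 0)
        printf("Error %ld: %s", errorCode, message);  // system messages end with \r\n
    else
        printf("Error %ld (no system description available)\n", errorCode);
}
```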
## -remarks
An application cannot create a key that is a direct child of <b>HKEY_USERS</b> or <b>HKEY_LOCAL_MACHINE</b>. An application can create subkeys in lower levels of the <b>HKEY_USERS</b> or <b>HKEY_LOCAL_MACHINE</b> trees.
If your service or application impersonates different users, do not use this function with <b>HKEY_CURRENT_USER</b>. Instead, call the <a href="/windows/desktop/api/winreg/nf-winreg-regopencurrentuser">RegOpenCurrentUser</a> function.
The <b>RegCreateKey</b> function creates all missing keys in the specified path. An application can take advantage of this behavior to create several keys at once. For example, an application can create a subkey four levels deep at the same time as the three preceding subkeys by specifying a string of the following form for the <i>lpSubKey</i> parameter:
<i>subkey1\subkey2\subkey3\subkey4</i>
Note that this behavior will result in the creation of unwanted keys if an existing key in the path is spelled incorrectly.
> [!NOTE]
> The winreg.h header defines RegCreateKey as an alias that automatically selects the ANSI or Unicode version of this function based on the definition of the UNICODE preprocessor constant. Mixing usage of the encoding-neutral alias with code that is not encoding-neutral can lead to mismatches that result in compilation or runtime errors. For more information, see [Conventions for Function Prototypes](/windows/win32/intl/conventions-for-function-prototypes).
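As a minimal usage sketch (not part of the original reference; the subkey path is hypothetical, and new code should prefer <a href="/windows/desktop/api/winreg/nf-winreg-regcreatekeyexa">RegCreateKeyEx</a> as noted above):

```cpp
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY hKey = NULL;

    // Creates every missing key in the path, per the remarks above.
    // "Software\\Contoso\\SampleApp" is an illustrative path only.
    LSTATUS status = RegCreateKey(HKEY_CURRENT_USER,
                                  TEXT("Software\\Contoso\\SampleApp"),
                                  &hKey);
    if (status != ERROR_SUCCESS)
    {
        printf("RegCreateKey failed with error code %ld\n", (long)status);
        return 1;
    }

    // ... use hKey with other registry functions here ...

    RegCloseKey(hKey);  // required: hKey is not a predefined key
    return 0;
}
```

Link against Advapi32.lib, as listed in the requirements above.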
## -see-also
<a href="/windows/desktop/api/winreg/nf-winreg-regclosekey">RegCloseKey</a>
<a href="/windows/desktop/api/winreg/nf-winreg-regcreatekeyexa">RegCreateKeyEx</a>
<a href="/windows/desktop/api/winreg/nf-winreg-regdeletekeya">RegDeleteKey</a>
<a href="/windows/desktop/api/winreg/nf-winreg-regopenkeyexa">RegOpenKeyEx</a>
<a href="/windows/desktop/SysInfo/registry-functions">Registry Functions</a>
<a href="/windows/desktop/SysInfo/registry">Registry Overview</a> | 43.209459 | 459 | 0.773729 | eng_Latn | 0.870793 |
fa49260522662a056e8af2487d8d9fc2c273db75 | 2,932 | md | Markdown | content/publication/gasstruct/index.md | marco-duarte/Website | 632fa7d069c5c91f01cb245ffd07e537d2534c8a | [
"MIT"
] | null | null | null | content/publication/gasstruct/index.md | marco-duarte/Website | 632fa7d069c5c91f01cb245ffd07e537d2534c8a | [
"MIT"
] | null | null | null | content/publication/gasstruct/index.md | marco-duarte/Website | 632fa7d069c5c91f01cb245ffd07e537d2534c8a | [
"MIT"
] | null | null | null | ---
# Documentation: https://wowchemy.com/docs/managing-content/
title: "Hub-and-Spoke Collusion with Horizontally Differentiated Spokes"
authors: ["Marco Duarte", "Daniel Chaves"]
date: 2021-12-05T12:33:09-06:00
doi: ""
# Schedule page publish date (NOT publication's date).
#publishDate: 2021-03-09T12:33:09-06:00
# Publication type.
# Legend: 0 = Uncategorized; 1 = Conference paper; 2 = Journal article;
# 3 = Preprint / Working Paper; 4 = Report; 5 = Book; 6 = Book section;
# 7 = Thesis; 8 = Patent
publication_types: ["0"]
# Publication name and optional abbreviated publication name.
publication: ""
publication_short: "JMP"
abstract: "A hub-and-spoke cartel, where firms (spokes) limit competition with the help of an upstream supplier (hub), is a type of collusive arrangement observed in a variety of industries. In most cases, spokes compensate the hub's help by excluding its rivals. Under those circumstances, how do the hub and spokes divide the rents from collusion? We study a hub-and-spoke cartel with an exclusion condition between gas stations and distributors in the gasoline market of Brazil's Federal District. Using a structural model of demand, we estimate the gas stations' incentive to collude for different splits of rents. We show that, although more rents to distributors increased the stations' incentive to deviate from the supplier, it also decreased their incentive to deviate on prices. In a counterfactual scenario where retailers extract all the rents from collusion, the cartel would need to decrease markups by 24% not to trigger price deviations. Another counterfactual points out that banning exclusive dealing contracts between stations and distributors would have destabilized the retail price coordination."
# Summary. An optional shortened abstract.
summary: ""
tags: []
categories: []
featured: false
# Custom links (optional).
# Uncomment and edit lines below to show custom links.
# links:
# - name: Follow
# url: https://twitter.com
# icon_pack: fab
# icon: twitter
url_pdf:
url_code:
url_dataset:
url_poster:
url_project:
url_slides:
url_source:
url_video:
# Featured image
# To use, add an image named `featured.jpg/png` to your page's folder.
# Focal points: Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight.
image:
caption: ""
focal_point: ""
preview_only: false
# Associated Projects (optional).
# Associate this publication with one or more of your projects.
# Simply enter your project's folder or file name without extension.
# E.g. `internal-project` references `content/project/internal-project/index.md`.
# Otherwise, set `projects: []`.
projects: []
# Slides (optional).
# Associate this publication with Markdown slides.
# Simply enter your slide deck's filename without extension.
# E.g. `slides: "example"` references `content/slides/example/index.md`.
# Otherwise, set `slides: ""`.
slides: ""
---
| 41.885714 | 1,109 | 0.757844 | eng_Latn | 0.979352 |
fa49936b2b7b609cc13b7f6d41bc16d7e1da4854 | 441 | md | Markdown | content/blog/a-future-we-can-vibe-with.md | daniellehoo/gatsby_dh | 501b4f2df7ae79f834b57e27079d1707cf0a09d0 | [
"MIT"
] | null | null | null | content/blog/a-future-we-can-vibe-with.md | daniellehoo/gatsby_dh | 501b4f2df7ae79f834b57e27079d1707cf0a09d0 | [
"MIT"
] | 2 | 2022-02-26T02:31:14.000Z | 2022-02-28T01:34:41.000Z | content/blog/a-future-we-can-vibe-with.md | daniellehoo/blog | 501b4f2df7ae79f834b57e27079d1707cf0a09d0 | [
"MIT"
] | null | null | null | ---
title: 'A Future We Can Vibe With'
date: '2017-08-12T01:52:17+00:00'
status: publish
permalink: /a-future-we-can-vibe-with
author: Danielle
excerpt: ''
type: post
id: 1048
category:
- Uncategorized
tag: []
post_format: []
---
[](http://www.daniellehoo.com/wp-content/uploads/2017/09/Screen-Shot-2017-09-11-at-9.51.37-PM.png) | 29.4 | 207 | 0.714286 | yue_Hant | 0.112991 |
fa49cac75e16dc7badd7939c2a8737c34a35b3e1 | 1,240 | md | Markdown | content/publication/oliveira-2017-generic/index.md | luizvbo/academic-kickstart | 7d2593cedf63026c8c06bd028ceba19386317248 | [
"MIT"
] | null | null | null | content/publication/oliveira-2017-generic/index.md | luizvbo/academic-kickstart | 7d2593cedf63026c8c06bd028ceba19386317248 | [
"MIT"
] | null | null | null | content/publication/oliveira-2017-generic/index.md | luizvbo/academic-kickstart | 7d2593cedf63026c8c06bd028ceba19386317248 | [
"MIT"
] | null | null | null | ---
title: A Generic Framework for Building Dispersion Operators in the Semantic Space
date: '2017-01-01'
publishDate: '2021-05-04T19:15:52.077627Z'
authors:
- Luiz Otavio V. B. Oliveira
- Fernando E. B. Otero
- Gisele L. Pappa
publication_types:
- '6'
abstract: 'This chapter proposes a generic framework to build geometric dispersion
(GD) operators for Geometric Semantic Genetic Programming in the context of symbolic
regression, followed by two concrete instantiations of the framework: a multiplicative
geometric dispersion operator and an additive geometric dispersion operator. These
operators move individuals in the semantic space in order to balance the population
around the target output in each dimension, with the objective of expanding the
convex hull defined by the population to include the desired output vector. An experimental
analysis was conducted in a testbed composed of sixteen datasets showing that dispersion
operators can improve GSGP search and that the multiplicative version of the operator
is overall better than the additive version.'
featured: false
publication: '*Genetic Programming Theory and Practice XIV*'
url_pdf: https://link.springer.com/chapter/10.1007/978-3-319-97088-2_12
---
| 47.692308 | 93 | 0.8 | eng_Latn | 0.990272 |
fa4a05cf248caec60333bc78616acb3550471bd4 | 97 | md | Markdown | _posts/201/2018-03-26-week-9.md | CloudCoppola/cloudcoppola.github.io | a45367f794c3af0a83ad76707e8e133755d57d32 | [
"MIT"
] | null | null | null | _posts/201/2018-03-26-week-9.md | CloudCoppola/cloudcoppola.github.io | a45367f794c3af0a83ad76707e8e133755d57d32 | [
"MIT"
] | null | null | null | _posts/201/2018-03-26-week-9.md | CloudCoppola/cloudcoppola.github.io | a45367f794c3af0a83ad76707e8e133755d57d32 | [
"MIT"
] | null | null | null | ---
layout: post
title: Week 9
---
26/03/2018 - I did not work on the group project this week.
| 13.857143 | 59 | 0.659794 | eng_Latn | 0.999554 |
fa4a167631beb0cd604a023a38894c4787537f1d | 424 | md | Markdown | .github/ISSUE_TEMPLATE.md | kakasoo/gachonAPI | 80cee2be80e47902e5756a1e7ed2c88693256ba0 | [
"MIT"
] | 1 | 2021-08-09T12:27:59.000Z | 2021-08-09T12:27:59.000Z | .github/ISSUE_TEMPLATE.md | kakasoo/gachonAPI | 80cee2be80e47902e5756a1e7ed2c88693256ba0 | [
"MIT"
] | null | null | null | .github/ISSUE_TEMPLATE.md | kakasoo/gachonAPI | 80cee2be80e47902e5756a1e7ed2c88693256ba0 | [
"MIT"
] | null | null | null | ## 💡 요구 API
- 해당 이슈의 내용과 기대되는 결과를 적어주세요.
- 새로운 resource에 대한 API를 말씀해주셔도 상관없습니다만, 크롤링 외 학교 서버를 거쳐야 하는 경우는 어려울 수 있습니다.
## 🚨 상세 설명
- 가능하시다면, request와 response의 상세 설명을 작성해주세요.
- 작성해주시지 않으면 임의로 생각하여 만들 수 밖에 없습니다.
- ex : 어떤 쿠키, 헤더 등을 제공할 것이며 데이터의 형태는 { length : 1, data : [] }의 형태이길 희망합니다.
## ✅ 체크리스트
- 상세 설명의 request, response 설명 외 고려해야 할 사항을 추가해주시면 좋겠습니다.
- [ ] 반드시 포함되어야 할 사항1
- [ ] 반드시 포함되어야 할 사항2
- [ ] 반드시 포함되어야 할 사항3
| 28.266667 | 79 | 0.639151 | kor_Hang | 1.00001 |
fa4aac3def393f42410bcb4f44cc53e287116884 | 210 | md | Markdown | docs/src/lang/modules.md | JonathanLorimer/apalache | e3b35e114247a21d3f25d5f0712a9ec6f7e01b41 | [
"Apache-2.0"
] | 155 | 2020-07-26T13:11:11.000Z | 2022-03-30T19:38:31.000Z | docs/src/lang/modules.md | JonathanLorimer/apalache | e3b35e114247a21d3f25d5f0712a9ec6f7e01b41 | [
"Apache-2.0"
] | 1,027 | 2020-06-26T20:44:31.000Z | 2022-03-31T17:06:27.000Z | docs/src/lang/modules.md | JonathanLorimer/apalache | e3b35e114247a21d3f25d5f0712a9ec6f7e01b41 | [
"Apache-2.0"
] | 12 | 2020-11-18T19:25:08.000Z | 2022-03-17T23:22:28.000Z | # Modules and instances
Work in progress... For the moment, check Chapters 2 and 4 of [Specifying
Systems].
[Specifying Systems]: http://lamport.azurewebsites.net/tla/book.html?back-link=user-operators.html
| 26.25 | 98 | 0.771429 | kor_Hang | 0.36627 |